# ask-community
j
Hi guys, I want to create a sensor that triggers a dbt job whenever a table in Snowflake is updated. Any ideas on how to do this?
m
We’re currently working on this. I’d be careful: if you run a SQL command to get the current state of the table, you spin up a warehouse to do so, which runs (and bills) for a minimum of 1 minute.
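To make that warning concrete, here is a back-of-the-envelope cost sketch (pure Python; the 60-second minimum per warehouse resume is Snowflake's billing granularity, and the polling intervals are hypothetical):

```python
# Rough model of warehouse seconds billed by a polling sensor.
# Assumption: every poll resumes the warehouse and bills at least
# `min_billing_s` seconds, even if the query itself is near-instant.

def billed_seconds_per_day(interval_s: int, min_billing_s: int = 60) -> int:
    polls_per_day = 86_400 // interval_s
    return polls_per_day * min_billing_s

# Polling every 30s bills at least 60s per poll, so the warehouse is
# effectively billed around the clock (2880 polls * 60 s = 172800 s,
# i.e. twice the seconds in a day).
print(billed_seconds_per_day(30))    # 172800
print(billed_seconds_per_day(3600))  # hourly polling: 24 * 60 = 1440
```

The takeaway: a short `minimum_interval_seconds` on a sensor that queries Snowflake keeps the warehouse from ever auto-suspending.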
j
Thanks @Matt Clarke, I'm doing something like this.
from datetime import datetime

from dagster import RunRequest, SensorEvaluationContext, sensor
from snowflake.connector import connect


def get_last_sync_timestamp():
    """Return the most recent UPDATE_STARTED timestamp for CHECKING_TABLE."""
    conn = connect(
        account=SNOWFLAKE_ACCOUNT,
        user=SNOWFLAKE_USER,
        password=SNOWFLAKE_PASSWORD,
        warehouse=SNOWFLAKE_WAREHOUSE,
        database=SNOWFLAKE_DATABASE,
        schema=SNOWFLAKE_SCHEMA,
    )
    try:
        cursor = conn.cursor()
        cursor.execute(
            f"SELECT MAX(TO_TIMESTAMP(UPDATE_STARTED)) FROM {SNOWFLAKE_TABLE} "
            f"WHERE \"TABLE\" = '{CHECKING_TABLE}';")
        result = cursor.fetchone()
        # Fall back to the earliest representable time when no rows match
        return result[0] if result[0] else datetime.min
    finally:
        cursor.close()
        conn.close()


@sensor(
    name="snowflake_sensor_run_process",
    job_name="run_from_ingest",
    minimum_interval_seconds=30,
)
def snowflake_sensor(context: SensorEvaluationContext):
    # last_run_key is the run_key of the last run this sensor requested;
    # since we use the sync timestamp as the run_key below, it doubles
    # as "the last time we triggered".
    last_run_time = context.last_run_key
    if last_run_time is None:
        last_run_time = datetime.min.strftime("%Y-%m-%d %H:%M:%S")

    # ISO-style format so the string comparison below orders correctly
    last_sync_timestamp = get_last_sync_timestamp().strftime("%Y-%m-%d %H:%M:%S")

    if last_sync_timestamp > last_run_time:
        context.log.info(f"New data detected, triggering pipeline at {last_sync_timestamp}")
        return RunRequest(
            run_key=last_sync_timestamp,
            # partition_key must be a single partition key string (here the
            # date part of the timestamp), not the partitions definition
            partition_key=last_sync_timestamp[:10],
            run_config={},
        )
    else:
        context.log.info(f"last_run_time: {last_run_time}. "
                         f"last_sync_timestamp: {last_sync_timestamp}. "
                         f"No new data detected")
s
Hi Julius, are you still looking for a solution to this issue?
j
@sean Hi, sorry for the late reply. My solution is the script above.