# ask-community

Alec Ryan

04/21/2022, 12:50 AM
I have 3 jobs: 1. load data to S3, 2. move data from S3 to Snowflake, 3. build dbt models. Jobs 1 and 2 rely on partition dates to run, but dbt cannot rely on a partition_date directly. How can I pass the date from job 2 to dbt? Can I roll them all into the same job?
:daggy-success: 1

Stephen Bailey

04/21/2022, 1:54 AM
I'm curious what dbt uses the partition date for?
My first thought was using a sensor that listens for Job 2 and yields a RunRequest for Job 3 that has the partition_date somewhere relevant in the config.
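For concreteness, a minimal sketch of that sensor pattern. The job names (s3_to_snowflake_job for Job 2, dbt_job for Job 3) and the op name inside the run_config are stand-ins, and the request_job argument assumes a reasonably recent Dagster release:

```python
from dagster import DagsterRunStatus, RunRequest, run_status_sensor

# Hypothetical project imports: s3_to_snowflake_job is Job 2, dbt_job is Job 3.
from my_project.jobs import dbt_job, s3_to_snowflake_job


@run_status_sensor(
    run_status=DagsterRunStatus.SUCCESS,
    monitored_jobs=[s3_to_snowflake_job],
    request_job=dbt_job,
)
def trigger_dbt_after_snowflake_load(context):
    # Partitioned runs carry their partition key in the "dagster/partition" tag.
    partition_date = context.dagster_run.tags.get("dagster/partition")

    yield RunRequest(
        run_key=context.dagster_run.run_id,
        run_config={
            # Hypothetical op name; put the date wherever the dbt job's
            # config expects it.
            "ops": {"run_dbt_models": {"config": {"partition_date": partition_date}}},
        },
    )
```

With request_job set, any RunRequest the sensor yields launches dbt_job with that run_config, so the partition date from Job 2's run flows straight into Job 3.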

Alec Ryan

04/21/2022, 1:55 AM
I'm using it as a runtime var to build incremental models one day at a time
👍 1

Stephen Bailey

04/21/2022, 1:56 AM
Yeah, looks like you could pass the run_config into the job config with RunRequests if you were okay using a sensor to trigger it: https://docs.dagster.io/_apidocs/schedules-sensors#run-requests
FWIW, I've been using this sensor-trigger pattern for the past couple of weeks and it works quite well.
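On the dbt side, the job just needs an op whose config accepts the date and forwards it to dbt as a runtime var. A rough sketch, assuming the op shells out to the dbt CLI directly; the op and job names are hypothetical and match the run_config used in the sensor sketch above:

```python
import json
import subprocess

from dagster import job, op


@op(config_schema={"partition_date": str})
def run_dbt_models(context):
    # Forward the date to dbt as a runtime var so incremental models can
    # filter on it, e.g. WHERE event_date = '{{ var("partition_date") }}'.
    dbt_vars = json.dumps({"partition_date": context.op_config["partition_date"]})
    subprocess.run(["dbt", "run", "--vars", dbt_vars], check=True)


@job
def dbt_job():
    run_dbt_models()
```

The sensor's run_config targets exactly this "ops" config slot, and the incremental models pick the date up through dbt's var() lookup.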

Alec Ryan

04/21/2022, 1:58 AM
Do you know of any examples where assets are used?
Generally, it seems like the same concept regardless.

Stephen Bailey

04/21/2022, 1:58 AM
No, not aware of any specific examples.
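Not something shown in the thread, but an asset-flavored variant of the same pattern could use an asset sensor that watches for the materialization Job 2 produces and requests the dbt job. A sketch under those assumptions; the asset key, job import, and the partition lookup via run tags are all illustrative:

```python
from dagster import AssetKey, RunRequest, asset_sensor

# Hypothetical: dbt_job is Job 3; "snowflake_raw_table" is the asset
# materialized by Job 2.
from my_project.jobs import dbt_job


@asset_sensor(asset_key=AssetKey("snowflake_raw_table"), job=dbt_job)
def dbt_on_snowflake_materialization(context, asset_event):
    # Recover the partition key from the run that produced the materialization.
    run = context.instance.get_run_by_id(asset_event.run_id)
    partition_date = run.tags.get("dagster/partition") if run else None

    yield RunRequest(
        run_key=asset_event.run_id,
        run_config={
            "ops": {"run_dbt_models": {"config": {"partition_date": partition_date}}},
        },
    )
```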