# ask-community

geoHeil

05/25/2022, 11:35 AM
When I have multiple jobs on the same schedule, how can I configure Dagster so that they finish by some defined point in time (at the latest) but do not all start at the same time?

sean

05/25/2022, 5:27 PM
@prha @daniel

daniel

05/25/2022, 5:29 PM
I'm having some trouble understanding the exact details of the request here. Do you have an example of what "finish (latest) at some defined point in time" would look like?

geoHeil

05/25/2022, 5:31 PM
Assuming I want to query/extract data from an external service (and do not want to overload it), I cannot query multiple assets provided by the same service at once. However, I want to ensure that everything is finished before some specific time, i.e. before a business user wants to access the data for their daily work.

daniel

05/25/2022, 5:32 PM
Ah, we have ways to let you set constraints that limit the number of concurrent jobs using a certain tag or resource - but we don't currently have anything that lets you supply a 'this has to be finished by X no matter what' constraint.
The topic of supporting built-in SLAs has come up in the past; this has some conceptual overlap with that.

geoHeil

05/25/2022, 5:34 PM
Agreed. Would it be possible to have 3 jobs scheduled every day at 06:00 but somehow have them kick off serially (or at least with limited concurrency)?
Would this be simpler than using the K8s + Celery deployment with a limited number of worker nodes?
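For reference, a minimal sketch of the "3 jobs on the same 06:00 schedule" setup being described, assuming hypothetical job and op names; on its own, this launches all three runs at roughly the same time:

```python
from dagster import ScheduleDefinition, job, op, repository


# Hypothetical op standing in for an extract against the external service.
@op
def extract_from_service():
    ...


# Three hypothetical jobs that all pull from the same external service.
@job
def customers_job():
    extract_from_service()


@job
def orders_job():
    extract_from_service()


@job
def invoices_job():
    extract_from_service()


# One schedule per job, all firing at 06:00 every day.
daily_schedules = [
    ScheduleDefinition(job=j, cron_schedule="0 6 * * *")
    for j in (customers_job, orders_job, invoices_job)
]


@repository
def etl_repo():
    return [customers_job, orders_job, invoices_job, *daily_schedules]
```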

daniel

05/25/2022, 5:44 PM
Yeah, you can apply a per-tag limit and give each run produced by those schedules the same tag: https://docs.dagster.io/deployment/run-coordinator#limiting-run-concurrency
and you can also prioritize runs within those rules
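A minimal sketch of how that per-tag limit and run priority could be wired up, following the linked docs; the tag key/value and job name are hypothetical, and the instance config assumes the QueuedRunCoordinator:

```python
from dagster import job, op


@op
def extract_from_service():
    ...


# Give every job that hits the external service the same tag so the queue can
# limit them as a group; "dagster/priority" controls dequeue order (higher
# integer values are dequeued first). Tags could also be set on the
# ScheduleDefinition instead of the job.
@job(
    tags={
        "external_service": "billing_api",  # hypothetical tag key/value
        "dagster/priority": "3",
    }
)
def invoices_job():
    extract_from_service()


# dagster.yaml (instance config): limit runs carrying that tag to one at a
# time, so the three 06:00 runs are queued and effectively execute serially.
#
# run_coordinator:
#   module: dagster.core.run_coordinator
#   class: QueuedRunCoordinator
#   config:
#     tag_concurrency_limits:
#       - key: "external_service"
#         value: "billing_api"
#         limit: 1
```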

geoHeil

05/25/2022, 5:58 PM
This looks great, thanks!