# ask-community
b
Is there a built-in way to set a maximum number of queued runs for a given (e.g.) pipeline? Do I need to create a custom launcher for this?
d
Hi Ben - you can set per-tag limits to accomplish this, see the example here: https://docs.dagster.io/deployment/run-coordinator#run-limits
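For reference, the per-tag limits mentioned above are configured on the `QueuedRunCoordinator` in `dagster.yaml`. A rough sketch of that config, with illustrative tag values (the key/value/limit entries here are placeholders, not a prescribed setup):

```yaml
run_coordinator:
  module: dagster.core.run_coordinator
  class: QueuedRunCoordinator
  config:
    max_concurrent_runs: 10
    tag_concurrency_limits:
      # at most 4 runs tagged database=redshift may be in progress at once
      - key: "database"
        value: "redshift"
        limit: 4
```

As discussed below, these limits gate how many tagged runs execute concurrently; they do not cap how many sit in the queue.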
b
My understanding is that this limits the number of concurrent runs, not the number of queued runs, right?
d
Oh, maybe I don't understand the use case. What are you hoping will happen to runs that are submitted when there are already that many queued runs, a failure?
b
Probably a cancellation? My issue is that some jobs are idempotent, so repeating them is unnecessary. It'd be nice to be able to express this in Dagster so that certain jobs are queued at most once. That was my thinking, at least. It sounds like defining a custom coordinator would be the best course of action here?
d
Yeah, a custom run coordinator would probably be the way to go there
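To illustrate the "queued at most once" behavior such a coordinator would need, here is a minimal sketch of dedup-on-submit logic, independent of Dagster's coordinator API. All names (`QueuedRunDeduper`, `submit`, `dequeue`, the dedup key) are hypothetical, chosen only to show the idea:

```python
from collections import deque


class QueuedRunDeduper:
    """Hypothetical queue holding at most one pending run per dedup key.

    A custom run coordinator could apply this check on submission,
    rejecting (effectively cancelling) duplicates of an idempotent job
    that is already waiting in the queue.
    """

    def __init__(self):
        self._queue = deque()
        self._queued_keys = set()

    def submit(self, dedup_key, run):
        # Reject the run if an identical one is already queued.
        if dedup_key in self._queued_keys:
            return False
        self._queued_keys.add(dedup_key)
        self._queue.append((dedup_key, run))
        return True

    def dequeue(self):
        # Pop the next run for launching. Its key is freed, so a later
        # submission of the same idempotent job may queue again.
        dedup_key, run = self._queue.popleft()
        self._queued_keys.discard(dedup_key)
        return run
```

The dedup key would typically be derived from the pipeline name plus any parameters that affect the result, so only truly redundant submissions are dropped.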
b
Alright - thanks for your help, Daniel