How do I limit the number of concurrent ops across...
# ask-community
i
How do I limit the number of concurrent ops across all jobs (i.e., set max_concurrent to 4 without explicitly setting the config for each job)? I looked at this but couldn’t find an example of how to do that
j
Hey @Issac Loo, this isn't currently possible out of the box, but you could write a custom `@job` decorator that wraps the Dagster `job` decorator and adds the concurrency settings (a sketch of that pattern is below).
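A minimal sketch of that wrapper idea, assuming the limit is applied by attaching a preconfigured multiprocess executor (the message above mentions concurrency tags but doesn't spell out which tag keys, so this variant injects `executor_def` instead); `limited_job` and the op/job names are made up for illustration:

```python
from dagster import job, multiprocess_executor, op

# Executor preconfigured to run at most 4 ops at once per run.
limited_executor = multiprocess_executor.configured({"max_concurrent": 4})

def limited_job(*args, **kwargs):
    """Drop-in replacement for @job that injects the capped executor
    unless the caller supplies their own executor_def."""
    kwargs.setdefault("executor_def", limited_executor)
    if args and callable(args[0]):
        # Bare usage: @limited_job
        return job(**kwargs)(args[0])
    # Parameterized usage: @limited_job(name="nightly", ...)
    return job(*args, **kwargs)

@op
def say_hello():
    ...

@limited_job
def my_job():
    say_hello()
```

Every job declared with `@limited_job` then gets the 4-op cap unless it passes its own `executor_def`.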
d
You can also configure a default executor on your Definitions object (or repository):
```python
from dagster import multiprocess_executor, Definitions

# Default executor capped at 4 concurrent ops per run
my_executor = multiprocess_executor.configured({"max_concurrent": 4})

defs = Definitions(..., executor=my_executor)
```
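A hedged usage sketch of that approach (the op and job names are made up): any job registered on the Definitions object without its own `executor_def` falls back to this default, so no per-job config is needed:

```python
from dagster import Definitions, job, multiprocess_executor, op

@op
def do_work():
    ...

# No executor or concurrency config on the job itself.
@job
def my_job():
    do_work()

my_executor = multiprocess_executor.configured({"max_concurrent": 4})

# my_job falls back to the default executor set here, so each of its
# runs executes at most 4 ops concurrently.
defs = Definitions(jobs=[my_job], executor=my_executor)
```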
i
thanks all!