# ask-community

AJ Floersch

06/08/2023, 8:36 PM
How do I increase the number of concurrent ops being executed when using Docker? I do not have any custom concurrency limits specified, and my understanding is those would only limit the total anyway, not increase the number run concurrently. I ask because when running locally, I generally have 10 ops running concurrently within a particular job. When executing the same job in our Docker configuration, it will only process 2 ops concurrently, resulting in much longer overall run times.

Zach

06/08/2023, 8:44 PM
For op-level concurrency you may need to adjust the multiprocess executor's `max_concurrent` config in your run config.
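A run config sketch along the lines of the Dagster docs' multiprocess executor config (the value 10 is just an example, not a recommendation):

```yaml
# Run config: cap the multiprocess executor at 10 concurrent ops
execution:
  config:
    multiprocess:
      max_concurrent: 10
```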
For run-level concurrency you can adjust `max_concurrent_runs` in your dagster.yaml file if you're on open source, or in the Dagster Cloud UI or via the dagster-cloud CLI: https://docs.dagster.io/guides/limiting-concurrency-in-data-pipelines#limiting-overall-runs
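A dagster.yaml sketch for the open-source deployment, assuming the queued run coordinator is in use (the limit of 25 is just an example):

```yaml
# dagster.yaml: limit how many runs execute at once across the deployment
run_coordinator:
  module: dagster.core.run_coordinator
  class: QueuedRunCoordinator
  config:
    max_concurrent_runs: 25
```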

AJ Floersch

06/08/2023, 8:57 PM
@Zach Thank you! The first link seemed to solve my issue. This particular line clarified why I was seeing the difference:
> By default, or if you set `max_concurrent` to be 0, this is the return value of `multiprocessing.cpu_count()`
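As a quick sanity check of that default, you can print what `multiprocessing.cpu_count()` returns in each environment; inside a Docker container this may differ from the host (for example, when the container is pinned to a subset of CPUs), which would explain seeing fewer concurrent ops there than locally:

```python
# Show the default op concurrency the multiprocess executor falls back to
# when max_concurrent is 0 or unset: the CPU count visible to this process.
import multiprocessing

default_concurrency = multiprocessing.cpu_count()
print(f"Default max_concurrent on this machine: {default_concurrency}")
```

Running this both on your local machine and inside the Docker container should show whether the difference in visible CPUs accounts for the 10-vs-2 gap you observed.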