# dagster-plus
Qwame
Hi team, these are the settings for my `prod` deployment in Dagster Cloud:
```yaml
run_queue:
  max_concurrent_runs: 10
  tag_concurrency_limits: []
run_monitoring:
  start_timeout_seconds: 1200
run_retries:
  max_retries: 0
sso_default_role: VIEWER
non_isolated_runs:
  max_concurrent_non_isolated_runs: 1
```
I ran a job locally and it executed 10 tasks concurrently, but in prod I am only seeing 2 concurrent tasks. We have a hybrid deployment on GCP. Am I missing anything?
daniel
Hi Qwame - when you say "2 concurrent tasks", can you define what exactly that means? Is that tasks within a single run?
Or is it that there are only 2 runs in progress at once?
Qwame
Yes, tasks within a single run.
There were 10 concurrent tasks when I ran this job locally with `dagster dev`, but in prod I only see 2 concurrent tasks. The job uses `DynamicOut`.
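For context, a minimal sketch of the kind of `DynamicOut` fan-out being described (the op and job names are hypothetical, not from the actual project):

```python
from dagster import DynamicOut, DynamicOutput, job, op

@op(out=DynamicOut())
def fan_out():
    # Emit 10 dynamic outputs; each mapped downstream op can run in parallel
    for i in range(10):
        yield DynamicOutput(i, mapping_key=str(i))

@op
def process(item: int) -> int:
    return item * 2

@job
def dynamic_job():
    fan_out().map(process)
```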
daniel
You can use the `max_concurrent` config field to set the maximum number of ops to execute concurrently within a single run: https://docs.dagster.io/concepts/ops-jobs-graphs/job-execution#default-job-executor - the default value is determined by the number of CPUs on your machine, which is likely why it's smaller in prod.
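A rough sketch of passing that field through a job's execution config, assuming the default multiprocess executor and hypothetical op/job names:

```python
from dagster import job, op

@op
def my_op():
    ...

# Hypothetical job whose default run config caps in-run op concurrency at 10
# for the default multiprocess executor
@job(config={"execution": {"config": {"multiprocess": {"max_concurrent": 10}}}})
def my_job():
    my_op()
```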
Qwame
Do I need to set this for each job, or can the dagster-cloud.yaml file have these settings as a default for our deployment?
daniel
You can configure the default executor, including that configuration, in your Definitions object or repository object - but it has to be defined in code currently.
Like in the example here: https://docs.dagster.io/deployment/executors#for-a-code-location
```python
from dagster import Definitions, multiprocess_executor

defs = Definitions(
    assets=[the_asset],
    jobs=[asset_job, op_job],
    executor=multiprocess_executor.configured({"max_concurrent": 10}),
)
```
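As a sketch of the per-job alternative (again with hypothetical op and job names), a single job can also carry its own executor instead of relying on the code-location default:

```python
from dagster import job, multiprocess_executor, op

@op
def my_op():
    ...

# Hypothetical job overriding in-run op concurrency just for itself
@job(executor_def=multiprocess_executor.configured({"max_concurrent": 10}))
def op_job():
    my_op()
```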
Qwame
Hmm, sweet!
Let me give this a try, thanks!
@daniel This worked. I didn't have to add additional nodes. Running 15 concurrent tasks and it's looking good. Thanks for your help!