Nicholas Pezolano
05/26/2023, 2:49 AM
my_job = define_asset_job(
    "my_job",
    AssetSelection.groups("my_group"),
    config={
        "execution": {
            "config": {
                "max_concurrent": 1,
                "dagster/max_retries": 3,
                "dagster/retry_strategy": "ALL_STEPS",
            }
        }
    },
    partitions_def=my_partitions_def,
)
Nicholas Pezolano
05/26/2023, 4:05 PM

owen
05/26/2023, 4:48 PM
my_job has multiple steps executing at once? If so, do you mind sharing a screenshot of that run + whatever's available in the "View tags and config" button in the top right?

Just to be more precise (apologies if you are already aware of this): that configuration applies purely within each run of the job; limiting global op concurrency (across runs) is not currently supported. However, this feature is being actively worked on and should be available soon.

Nicholas Pezolano
05/26/2023, 9:34 PM

Nicholas Pezolano
05/26/2023, 9:39 PM

Nicholas Pezolano
05/26/2023, 9:39 PM

Nicholas Pezolano
05/26/2023, 9:40 PM

Nicholas Pezolano
05/26/2023, 10:35 PM

owen
05/26/2023, 10:40 PM
execution:
  config:
    multiprocess:
      max_concurrent: 1
(note the extra multiprocess layer of nesting)
https://docs.dagster.io/guides/limiting-concurrency-in-data-pipelines#limiting-overall-concurrency-in-a-job
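Folding owen's fix back into the original snippet, the `max_concurrent` key moves under an extra `multiprocess` layer. A sketch of the corrected run config (the retry settings are split out into run tags here as an assumption, since `dagster/...` keys are tag names rather than execution-config keys; `my_group` and `my_partitions_def` come from the original post):

```python
# Corrected run config per owen's message: max_concurrent must sit
# under the "multiprocess" executor key, not directly under "config".
run_config = {
    "execution": {
        "config": {
            "multiprocess": {          # the missing layer of nesting
                "max_concurrent": 1,   # at most one step at a time per run
            }
        }
    }
}

# Assumed: the dagster/* retry keys are passed as run tags instead.
retry_tags = {
    "dagster/max_retries": 3,
    "dagster/retry_strategy": "ALL_STEPS",
}

# With dagster installed, the job definition would then look like:
# my_job = define_asset_job(
#     "my_job",
#     AssetSelection.groups("my_group"),
#     config=run_config,
#     tags=retry_tags,
#     partitions_def=my_partitions_def,
# )

print(run_config["execution"]["config"]["multiprocess"]["max_concurrent"])
```

Note that, as owen points out above, this only limits concurrency within a single run; it does not cap concurrency across runs.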