Is setting max_concurrent this way still supported?
# ask-community
Nicholas Pezolano:
Is setting max_concurrent this way still supported? I jumped a few versions to 1.3.6 and noticed multiple assets being run concurrently for the job group instead of just one.
```python
my_job = define_asset_job(
    "my_job",
    AssetSelection.groups("my_group"),
    config={
        "execution": {
            "config": {
                "max_concurrent": 1,
                "dagster/max_retries": 3,
                "dagster/retry_strategy": "ALL_STEPS",
            }
        }
    },
    partitions_def=my_partitions_def,
)
```
It looks like the config is not being set from here: in the Dagit console I don't see any config on my asset definitions in that asset group.
owen:
hi @Nicholas Pezolano! To be clear, when you say "multiple assets being run concurrently for the job group", do you mean that a single run of `my_job` has multiple steps executing at once? If so, do you mind sharing a screenshot of that run, plus whatever's available in the "View tags and config" button in the top right? Just to be more precise (apologies if you're already aware of this): that configuration applies purely within each run of the job, and limiting global op concurrency (across runs) is not currently supported. However, this is a feature that is being actively worked on and should be available soon.
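For the cross-run case owen mentions, a workaround that did exist at the time is run-level (rather than op-level) throttling: tag the job's runs and cap that tag through the QueuedRunCoordinator's tag_concurrency_limits in the deployment's dagster.yaml. A minimal sketch of the Python side follows; the `concurrency_group` tag name is hypothetical, and this limits whole runs, not individual ops:

```python
from dagster import AssetSelection, define_asset_job

# Hypothetical tag name; the actual cap is configured in dagster.yaml, e.g.:
#
#   run_coordinator:
#     module: dagster.core.run_coordinator
#     class: QueuedRunCoordinator
#     config:
#       tag_concurrency_limits:
#         - key: "concurrency_group"
#           value: "my_group"
#           limit: 1
#
throttled_job = define_asset_job(
    "throttled_job",
    AssetSelection.groups("my_group"),
    # Every run of this job carries the tag, so the run queue
    # allows at most one such run in progress at a time.
    tags={"concurrency_group": "my_group"},
)
```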
Nicholas Pezolano:
Sure, it's the first case: I want to limit the concurrency within that job group if possible, not globally. Within that run I see it running multiple assets at once; I'll send a screenshot of that as well for this job once it runs at 6:30.
I see the config in the run when I click "View tags and config"; it's on the assets page of the job where I don't see the config. Sorry, I mixed them up before.
[two screenshots attached]
@owen Here's a screenshot of the scheduled run not following the concurrency limit.
owen:
hm, I think you're conflating tags and config when configuring your job here (which is fair, as they're very similar). `dagster/max_retries` and `dagster/retry_strategy` are both intended to be configured via the tags on the job, rather than through the config (see: https://docs.dagster.io/deployment/run-retries). For `max_concurrent`, that is a configurable value, but it should go under:
```yaml
execution:
  config:
    multiprocess:
      max_concurrent: 1
```
(note the extra `multiprocess` layer of nesting) https://docs.dagster.io/guides/limiting-concurrency-in-data-pipelines#limiting-overall-concurrency-in-a-job
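Putting owen's two fixes together (retries as tags, and the `multiprocess` nesting for `max_concurrent`), a sketch of the corrected definition from the top of the thread, reusing the original names; the partitions definition below is a placeholder standing in for whatever `my_partitions_def` actually is:

```python
from dagster import AssetSelection, DailyPartitionsDefinition, define_asset_job

# Placeholder for the partitions definition from the original snippet.
my_partitions_def = DailyPartitionsDefinition(start_date="2023-01-01")

my_job = define_asset_job(
    "my_job",
    AssetSelection.groups("my_group"),
    # Retry behavior is read from run tags, not run config.
    tags={
        "dagster/max_retries": 3,
        "dagster/retry_strategy": "ALL_STEPS",
    },
    # Step concurrency within a single run: note the extra
    # `multiprocess` layer around max_concurrent.
    config={
        "execution": {
            "config": {
                "multiprocess": {
                    "max_concurrent": 1,
                }
            }
        }
    },
    partitions_def=my_partitions_def,
)
```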