# ask-community

martin o leary

07/18/2023, 10:23 AM
Hey Team, I am running a local instance of dagit in a dev container, and for some reason my backfills only run one at a time, even though my run queue settings allow concurrency:
```yaml
run_queue:
  max_concurrent_runs: 6
  tag_concurrency_limits:
    - key: "dagster/backfill"
      limit: 4
```
I'm wondering if Docker might be constraining the resources available to dagit? Has anyone seen anything like this before?
Here is a backfill run with only a single job executing at a time, along with my deployment's daemons.
Oh, and my version of dagster:
```
name        : dagster
version     : 1.3.14
description : The data orchestration platform built for productivity.
```
OK, so I removed the `tag_concurrency_limits` key from my `run_queue` config and it seems to work now.
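For reference, the resulting working fragment of `dagster.yaml` would then look roughly like this (a sketch, with only the `tag_concurrency_limits` key removed):

```yaml
run_queue:
  max_concurrent_runs: 6  # up to 6 queued runs may be in progress at once
```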

claire

07/18/2023, 7:16 PM
Hi Martin. Yep, so that `dagster/backfill` tag is applied to each run that is created via a backfill. Setting a limit of `4` there would enforce that a maximum of 4 runs triggered via a backfill can be in progress at any given time.
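For example, a `dagster.yaml` fragment along the lines of Martin's config (a sketch; the specific numbers are illustrative) would cap backfill-triggered runs at 4 even though up to 6 runs overall may be in flight:

```yaml
run_queue:
  max_concurrent_runs: 6        # overall cap across all queued runs
  tag_concurrency_limits:
    - key: "dagster/backfill"   # tag automatically applied to runs created by a backfill
      limit: 4                  # at most 4 backfill-triggered runs in progress at once
```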

martin o leary

07/18/2023, 7:49 PM
Cheers @claire - I was only getting 1 run at a time with that set, though (I could have made that clearer 😀).

Julien DEBLANDER

08/16/2023, 4:36 PM
Hi @claire, can we restrict some backfills to run with a lower limit? It seems that the `dagster/backfill` tag overrides all other tags. In my dagster.yaml file I set:
```yaml
run_queue:
  max_concurrent_runs: 3
  tag_concurrency_limits:
    - key: "dagster/backfill"
      limit: 3
    - key: "dbt"
      value: "incremental"
      limit: 1
```
When defining the asset job, I set the tag:
```python
tags = {"dbt": "incremental"}
```
But backfills still run 3 at a time. Any thoughts on this? Thanks