# ask-ai

**Q:** How do I set up max concurrent steps with the Celery K8s executor for Dagster?
To set a maximum concurrency for steps when using the `celery_k8s_job_executor` with Dagster, use the `tag_concurrency_limits` setting in the executor's configuration. This lets you cap the number of concurrently running steps that share a given tag. For example, to limit steps tagged `database: redshift` to two at a time:
```python
from dagster import job
from dagster_celery_k8s import celery_k8s_job_executor

@job(
    executor_def=celery_k8s_job_executor,
    config={
        "execution": {
            "config": {
                "tag_concurrency_limits": [
                    {
                        "key": "database",
                        "value": "redshift",
                        "limit": 2,
                    },
                ],
            },
        },
    },
)
def my_job():
    ...
```
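To make the semantics concrete, here is a minimal pure-Python sketch of how a tag-keyed concurrency limiter behaves. This is illustrative only, not Dagster's actual implementation: every step whose tags match a configured `(key, value)` pair shares a counting semaphore of size `limit`.

```python
import threading
from contextlib import contextmanager


class TagConcurrencyLimiter:
    """Illustrative limiter (not Dagster internals): steps whose tags
    match a configured (key, value) pair share a semaphore of size
    `limit`, so at most `limit` such steps hold a slot at once."""

    def __init__(self, limits):
        # limits: list of {"key": ..., "value": ..., "limit": ...},
        # mirroring the tag_concurrency_limits config shape.
        self._semaphores = {
            (entry["key"], entry["value"]): threading.BoundedSemaphore(entry["limit"])
            for entry in limits
        }

    @contextmanager
    def step_slot(self, tags):
        # Acquire a slot for every configured limit this step's tags match;
        # untagged (or non-matching) steps pass through unrestricted.
        matched = [
            sem
            for (key, value), sem in self._semaphores.items()
            if tags.get(key) == value
        ]
        for sem in matched:
            sem.acquire()
        try:
            yield
        finally:
            for sem in matched:
                sem.release()
```

With `limit: 2`, two steps tagged `database: redshift` can hold slots simultaneously; a third blocks until one finishes, while steps without that tag are unaffected.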
In this example, at most two steps tagged `database: redshift` run concurrently. You apply these tags to the ops whose steps you want to limit. Note that these limits are enforced on a per-run basis. If you need concurrency limits that span multiple runs, the Celery-based executors (`celery_executor` and `celery_k8s_job_executor`) can enforce a global cap by limiting the number of workers serving a given Celery queue. For more detail and further examples, refer to the Dagster documentation, specifically the guides on limiting concurrency in data pipelines and the API documentation for the `celery_k8s_job_executor`.