Liezl Puzon — 04/30/2022, 7:20 AM
I have a dagster job that has dynamic ops (spins up k8s jobs via the k8s_executor). Is there a way to specify a concurrency limit X so that a given job run will only schedule X parallel k8s jobs at a time? tag_concurrency_limits + some clever chunking of my 5000 dynamic ops across ~100 separate job runs might work for this, but it feels hacky to me.
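The chunking workaround mentioned above can be sketched in plain Python: split the 5000 dynamic-op inputs into roughly 100 batches and launch one job run per batch, with each run carrying a tag so that a queued run coordinator configured with tag_concurrency_limits caps how many of those runs execute at once. The chunk helper and the tag name below are invented for illustration, not part of dagster:

```python
# Hypothetical sketch of the chunking workaround: split 5000 work items
# into ~100 batches, one batch per job run. Each run would carry a tag
# (e.g. {"concurrency-group": "k8s-heavy"}) so that a run coordinator
# configured with tag_concurrency_limits limits how many of these runs
# are in flight at once. `chunk` is a helper invented here.

def chunk(items, n_batches):
    """Split items into at most n_batches roughly equal batches."""
    size = -(-len(items) // n_batches)  # ceiling division
    return [items[i:i + size] for i in range(0, len(items), size)]

work_items = list(range(5000))
batches = chunk(work_items, 100)

print(len(batches))                 # number of job runs to launch
print(sum(len(b) for b in batches)) # total items covered across runs
```

The concurrency cap then applies per run rather than per op, which is exactly why it feels hacky: the batch size, not the executor, determines how much work each concurrent slot carries.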
prha — 05/02/2022, 5:54 PM

Liezl Puzon — 05/02/2022, 6:00 PM

prha — 05/02/2022, 6:06 PM
celery_k8s_job_executor? https://docs.dagster.io/_apidocs/libraries/dagster-celery-k8s#dagster_celery_k8s.celery_k8s_job_executor. You could set up queues according to the concurrency constraints you have:
https://docs.dagster.io/deployment/guides/kubernetes/deploying-with-helm-advanced
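The queue-based approach in the linked Helm guide bounds parallelism by the number of celery workers serving each queue. A rough values-file sketch follows; the field layout is assumed from the Dagster Helm chart of that era, and the "limited-queue" name and replica counts are invented for illustration:

```yaml
# Hypothetical Helm values sketch for dagster's CeleryK8sRunLauncher:
# each queue's worker replica count caps how many ops run in parallel
# on that queue.
runLauncher:
  type: CeleryK8sRunLauncher
  config:
    celeryK8sRunLauncher:
      workerQueues:
        - name: "dagster"         # default queue
          replicaCount: 2
        - name: "limited-queue"   # capped queue for the k8s-heavy ops
          replicaCount: 4
```

Ops would then opt into the capped queue via a tag such as {"dagster-celery/queue": "limited-queue"}, so the limit applies per queue across all ops routed to it rather than per run.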
Liezl Puzon — 05/03/2022, 10:05 PM

daniel — 05/03/2022, 10:44 PM