# deployment-kubernetes
Q: What governs the "number of job steps run in parallel" on K8s when using the CeleryK8sRunLauncher? While I can start multiple (e.g. 4) pipeline runs concurrently, they only seem to be able to run a maximum of 2 concurrent job steps across all 4 running pipelines. Background: I have an autoscaling GKE cluster and a couple of scheduled Dagster pipelines running on it. Ideally I'd like Dagster to normally run with a minimal number of "persistent" pods, except when running a scheduled pipeline, in which case I want it to spin up as many pods as possible (given the pipeline's solid parallelisation constraints).
A:
> only seem to be able to run a maximum of 2 concurrent job steps

I believe this lines up with the default of spawning 2 workers for the default Celery queue (https://dagster.phacility.com/source/dagster/browse/master/helm/dagster/values.yaml$325-329), which is another dimension to consider in addition to `worker_concurrency`.
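If the 2-worker default is the bottleneck, both dimensions can be raised via a Helm values override. The sketch below is an assumption about the chart layout: key paths (`runLauncher.config.celeryK8sRunLauncher`, `workerQueues`, `configSource`) follow recent versions of the Dagster Helm chart and may differ in older releases, so check it against the `values.yaml` linked above for your chart version:

```yaml
# values.yaml override (sketch; key paths vary across chart versions)
runLauncher:
  type: CeleryK8sRunLauncher
  config:
    celeryK8sRunLauncher:
      # One entry per Celery queue. replicaCount is the number of
      # worker pods serving the queue; the default of 2 matches the
      # observed cap of 2 concurrent steps.
      workerQueues:
        - name: "dagster"
          replicaCount: 4
      # Extra Celery configuration passed to the workers.
      # worker_concurrency is how many steps a single worker pod
      # executes in parallel.
      configSource:
        worker_concurrency: 4
```

Roughly, the maximum number of steps executing at once is the number of worker pods multiplied by each worker's concurrency, so raising either (or both) lifts the cap.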