Is there a way to limit the number of queued jobs ...
# ask-ai
j: Is there a way to limit the number of queued jobs in the `run_coordinator` setup?
s: Yes. You can configure the `QueuedRunCoordinator` with the `max_concurrent_runs` parameter, which sets the maximum number of runs allowed to be in progress at once. The default is 10; you can set it to -1 to disable the limit, to 0 to stop any runs from launching, or to any positive integer for a specific limit (other negative values are disallowed). Here is an example of how you might configure the `QueuedRunCoordinator` in your `dagster.yaml`:
```yaml
run_coordinator:
  module: dagster._core.run_coordinator
  class: QueuedRunCoordinator
  config:
    max_concurrent_runs: 10  # set your desired limit here
```
This configuration ensures that no more than the specified number of runs are in progress at the same time. If the limit is reached, additional runs are queued until some of the in-progress runs complete.
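If you also need to limit concurrency for particular groups of runs, the same `config` block accepts `tag_concurrency_limits`. Below is a minimal sketch: the `team` tag is a hypothetical user-defined tag, while `dagster/schedule_name` is a tag Dagster attaches to runs launched by a schedule:

```yaml
run_coordinator:
  module: dagster._core.run_coordinator
  class: QueuedRunCoordinator
  config:
    max_concurrent_runs: 10
    tag_concurrency_limits:
      # At most 5 in-progress runs carrying the (hypothetical) tag team=backend.
      - key: team
        value: backend
        limit: 5
      # At most 1 in-progress run per distinct schedule name.
      - key: dagster/schedule_name
        limit: 1
        value:
          applyLimitPerUniqueValue: true
```

Note that these limits, like `max_concurrent_runs`, cap how many runs may be in progress at once; they do not cap how many runs can wait in the queue.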
j: I have set `max_concurrent_runs: 20`, yet I currently have the same job queued 1000 times. Here is my full config:
```yaml
run_coordinator:
  module: dagster._core.run_coordinator
  class: QueuedRunCoordinator
  config:
    dequeue_num_workers: 4
    dequeue_use_threads: true
    max_concurrent_runs: 20
    tag_concurrency_limits:
      - key: dagster/image
        limit: 1
        value:
          applyLimitPerUniqueValue: true
      - key: dagster/schedule_name
        limit: 1
        value:
          applyLimitPerUniqueValue: true
```
What is the mistake?