# ask-community
h
👋 Hello, team! Can you please help me overcome the issue below? I have a sensor [my_sensor] defined on a job [my_job]. [my_job] executes an op [my_op], which requires an input [filename] that I am passing from the sensor using:
yield RunRequest(
    run_key=None,
    run_config=RunConfig(
        ops={"my_op": FileConfig(filename="customer_name")}
    ),
)
[my_op] returns a dynamic output. I want to process all the dynamic outputs sequentially, not in parallel. I am trying the config below on the job, but the dynamic output mappings still run in parallel:
@job(
    config={
        "ops": {"my_op": {"config": {"filename": "my_sensor"}}},
        "execution": {
            "config": {
                "multiprocess": {
                    "max_concurrent": 1
                }
            }
        }
    }
)
How can I achieve this?
t
I had a similar question before (see the post above). You should probably wait for an answer from someone with more in-depth knowledge, but I think you can do the following: add a concurrency limit on a specific tag in your dagster.yaml, then add that tag to your job. The separate runs launched from the same op for different files will all share the tag and hence be limited by this global tag concurrency limit. E.g. in the dagster.yaml:
runCoordinator:
  enabled: true
  type: QueuedRunCoordinator
  config:
    queuedRunCoordinator:
      tagConcurrencyLimits:
        - key: "your_tag_key"
          value: "your_tag_value"
          limit: 1
h
Thanks @Tim Weelinck, I tried this approach already, but it didn't work.
z
Would the in-process executor work for your needs? I think you can also configure it on a job like this:
from dagster import in_process_executor, job

@job(executor_def=in_process_executor)
def a_job():
    some_op().map(another_op)
h
Thanks! @Zachary Romer, this worked after a small change to how I pass the input to the op from the sensor.