#ask-community

Bart Reijmer

07/31/2023, 5:23 PM
Hi everyone, I have a job with about 1200 ops that take anywhere from 1-5 seconds each to complete. Running these sequentially takes too long for our use case, and the multiprocess executor has too much overhead to improve the execution time. Any ideas for introducing concurrency? I was looking at the Dask executor, but I would prefer to stay away from learning and setting up a new technology. Any ideas would be helpful

Tim Castillo

07/31/2023, 7:36 PM
Hi, we usually recommend leveraging Dask/Celery to manage that concurrency. If you don't want to use them, you might get some success with the in_process_executor, depending on the type of computations being done per op, though it likely won't scale well.
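(If the per-item work is I/O-bound, one lightweight pattern that avoids both a new scheduler and per-op subprocess overhead is to fan the items out inside a single op with a standard-library thread pool. This is a minimal sketch, not something recommended in the thread; `process_item` and the worker count are hypothetical stand-ins.)

```python
from concurrent.futures import ThreadPoolExecutor

def process_item(item: int) -> int:
    # Hypothetical stand-in for the 1-5 second unit of work.
    return item * 2

def run_all(items):
    # Fan the items out across threads; thread startup is cheap
    # compared to spawning one subprocess per op, which is the
    # multiprocess-executor overhead described above.
    with ThreadPoolExecutor(max_workers=32) as pool:
        return list(pool.map(process_item, items))

results = run_all(range(5))
```

Note this only helps if the ops release the GIL (network calls, file I/O); CPU-bound work would still need processes.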

Bart Reijmer

07/31/2023, 10:31 PM
thanks @Tim Castillo