# ask-community
Dong Kim

02/16/2023, 8:36 PM
Hi, I am trying to integrate in-house applications into a Dagster workflow. The approach I am currently considering: an op submits a job to Celery workers through a message queue. When a Celery worker completes the task, it submits result metadata to another queue, which a Dagster sensor polls and returns a run request from. Is this a workable workflow in Dagster, or am I going down the wrong path?
owen

02/16/2023, 9:33 PM
hi @Dong Kim! What you described would be possible, but I think there's likely a simpler implementation for your use case. Dagster has a pre-built `celery_executor` / `celery_k8s_executor`, which will launch each op as a separate Celery task. This means that your op can just be regular Python code (allowing you to test it locally), and Dagster will handle submitting work to the queue. Here are some docs to help you evaluate whether this would work for you: https://docs.dagster.io/_apidocs/libraries/dagster-celery https://docs.dagster.io/deployment/guides/kubernetes/deploying-with-helm-advanced