# announcements
c
Hi! I made a small example of dagster integration with celery and docker, incorporating external intermediate storage as well as the DB. Mainly as an exercise to understand how to achieve a full configuration with celery. I would like to ask whether you could give me advice on how to optimize the processes. It would be nice to know how to improve this structure to form a base or template for a more serious deployment. Code here: https://github.com/astenuz/dagster-celery-test
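For readers following along: a setup like the one described usually decomposes into a handful of compose services. The sketch below is not taken from the linked repo; the service names, images, commands, and the choice of RabbitMQ as the Celery broker are all assumptions for illustration.
```yaml
# docker-compose.yml (sketch only; names, images, and commands are illustrative)
version: "3.7"

services:
  # Serves the Dagit UI and submits runs to the Celery queue
  dagit:
    build: .
    command: dagit -h 0.0.0.0 -p 3000
    ports:
      - "3000:3000"
    environment:
      DAGSTER_HOME: /opt/dagster/dagster_home
    depends_on:
      - postgres
      - rabbitmq

  # Celery worker that executes the actual compute; scale this service out
  dagster-worker:
    build: .
    command: dagster-celery worker start -y /opt/dagster/dagster_home/celery.yaml
    environment:
      DAGSTER_HOME: /opt/dagster/dagster_home
    depends_on:
      - postgres
      - rabbitmq

  # Run and event-log storage for the Dagster instance
  postgres:
    image: postgres:11
    environment:
      POSTGRES_USER: dagster
      POSTGRES_PASSWORD: dagster
      POSTGRES_DB: dagster

  # Celery broker
  rabbitmq:
    image: rabbitmq:3.8
```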
a
This looks really solid - nice job!
does `--scale dagster-worker=N` work?
c
Thanks! And yes, adding that to `docker-compose up` works.
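For example, assuming the worker service is named `dagster-worker` as in the sketch above, `docker-compose up --scale dagster-worker=3` brings up three worker containers consuming from the same Celery broker.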
a
Dealing with the compute log manager is a bit awkward - but we don't really have a good thing to slot in
c
Ah ok, that could also be handled with S3, right?
a
Yep, what you have is as good as you can do - just leave it commented out, since people should opt in with their own settings
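For reference, opting in to the S3-backed compute log manager mentioned above might look roughly like this in `dagster.yaml`; the bucket, prefix, and local directory values are placeholders.
```yaml
# dagster.yaml (sketch) - uncomment and fill in to opt in to S3 compute logs
compute_logs:
  module: dagster_aws.s3.compute_log_manager
  class: S3ComputeLogManager
  config:
    bucket: "my-dagster-compute-logs"        # placeholder bucket name
    prefix: "dagster-celery-test"            # placeholder key prefix
    local_dir: "/tmp/dagster-compute-logs"   # optional local staging directory
```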
c
alright, thanks!
I also wanted to ask: does it make sense to have the same image for dagit and the workers? Could there be a way to segment this?
a
It's the best thing to do for now - we're working on supporting dagit fetching metadata from a separate container for 0.9.0
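In compose terms, keeping a single image for both services could be expressed with a YAML anchor so there is only one place to change the image later. This is just one way to write it, and the names below are illustrative, not taken from the repo.
```yaml
# Fragment of docker-compose.yml (illustrative); requires compose file format 3.4+
x-dagster-image: &dagster_image
  build:
    context: .
    dockerfile: Dockerfile   # one Dockerfile containing dagster, dagster-celery, and the pipeline code

services:
  dagit:
    <<: *dagster_image
    command: dagit -h 0.0.0.0 -p 3000

  dagster-worker:
    <<: *dagster_image
    command: dagster-celery worker start
```
Until dagit can load pipeline metadata from a separate container (the 0.9.0 work mentioned above), both services need the pipeline code installed, so sharing one image keeps them from drifting apart.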