# announcements
m
Hello again! I'm trying to set up a dockerized celery deployment on a single machine:
1. I made a docker-compose with rabbitmq, dagit master, and dagster celery worker containers
2. I had to use a custom celery_config.yaml to override the broker url with the internal docker network address
3. But I still want to use filesystem storage by mapping all my containers to a single shared volume on disk
And when I try to launch my pipeline I get
dagster.check.CheckError: Invariant failed. Description: Must use S3 or GCS storage with non-local Celery broker: amqp://guest:guest@cube_rabbitmq:5672// and backend: rpc://
Any ideas?
And anyway why can't one use NFS to launch distributed celery execution?
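For reference, a minimal docker-compose sketch of the setup described above (service names, images, volume paths, and the worker command are assumptions, not the actual files):

```yaml
version: "3"
services:
  cube_rabbitmq:
    image: rabbitmq:3

  dagit:
    build: .
    command: dagit -h 0.0.0.0 -p 3000
    volumes:
      - dagster-storage:/opt/dagster/storage   # shared filesystem storage

  celery-worker:
    build: .
    # celery_config.yaml points the broker at the internal docker network
    # address, e.g. amqp://guest:guest@cube_rabbitmq:5672//
    command: celery -A dagster_celery.tasks worker  # assumed entrypoint; check the dagster-celery docs for your version
    volumes:
      - dagster-storage:/opt/dagster/storage   # same shared volume

volumes:
  dagster-storage:
```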
s
Ah interesting. That's just an arbitrary restriction we added to keep people from messing up. Too strict. Can you file an issue? We'll fix this up quickly
m
sure
s
are you building off public version or off of master?
m
7.9 - I just pip install it inside my image (7.10, btw, implicitly requires dagster_k8s and blows up if it isn't installed; I've made another issue on that). I've already started trying to deploy with local minio s3 storage and am now fighting against 'Invalid endpoint' 😃
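In case it helps the next reader: botocore's 'Invalid endpoint' typically means the configured URL fails its scheme/host validation, e.g. a minio endpoint given as `minio:9000` without the `http://` prefix. A rough stdlib sketch of that check (`looks_like_valid_endpoint` is a hypothetical helper, not botocore's actual validator):

```python
from urllib.parse import urlparse

def looks_like_valid_endpoint(url: str) -> bool:
    """Rough approximation of the scheme/host check an endpoint must pass."""
    parsed = urlparse(url)
    # A bare "minio:9000" parses with scheme "minio" and no netloc,
    # so it fails; "http://minio:9000" parses cleanly.
    return parsed.scheme in ("http", "https") and bool(parsed.netloc)

print(looks_like_valid_endpoint("minio:9000"))         # missing scheme
print(looks_like_valid_endpoint("http://minio:9000"))  # explicit scheme
```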