# ask-community

Alexander Whillas

08/18/2022, 3:07 AM
Setups for different environments? I'm trying to keep one setup for both DEV and PROD, but I need to have different config files for each (local machine vs AWS ECS). The host names change too (ECS assigns its own host name, while locally it's the container name from docker-compose). Also unsure how to pull in only the repos that I've set up separately for development/testing and production? Is there a recommended way / best practice for this?
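One way to soften the host-name difference (a sketch, not from this thread; the variable names are made up) is that `dagster.yaml` can read individual values from environment variables via the `env:` key, so the same file can resolve a different host in each environment:

```yaml
# Sketch: the same dagster.yaml in DEV and PROD, with the postgres host
# supplied per environment (container name locally, ECS host name in prod).
run_storage:
  module: dagster_postgres.run_storage
  class: PostgresRunStorage
  config:
    postgres_db:
      hostname:
        env: DAGSTER_PG_HOST   # hypothetical variable name
      username:
        env: DAGSTER_PG_USER
      password:
        env: DAGSTER_PG_PASSWORD
      db_name: dagster
      port: 5432
```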
Looking at the options here: • the `workspace.yaml` can be passed as an arg in the container for Dagit, but there seems to be no option for the daemon? (or is it not used there?) • Would be nice if the `dagster.yaml` file could be passed as a command arg wherever it is used? • With the gRPC option, can one specify which of the repos exported in the repo file to use? If there are mixed local and production versions, they all get exported atm afaik
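For reference, a sketch of how two workspace files might differ per environment (the host names and location name here are assumptions, not from the thread):

```yaml
# workspaces/dev.yaml -- points at the docker-compose container
load_from:
  - grpc_server:
      host: user_code           # container name from docker-compose
      port: 4000
      location_name: dags

# workspaces/prod.yaml -- points at the ECS service
load_from:
  - grpc_server:
      host: user-code.internal  # whatever host name ECS gives the service
      port: 4000
      location_name: dags
```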


08/18/2022, 6:51 AM
I’ll preface this by saying that this is based on my understanding of the system and is possibly (maybe even likely?) wrong, but I’m sure someone from Elementl will read this and can correct whatever wrong assumptions I’m passing on. 1. Your `dagster.yaml` should be in your `DAGSTER_HOME`. You can have different ones in local and cloud environments, and the local one doesn’t need to point to a postgres instance; doing so would mean it accesses the same event logs, schedule storage and run storage, which you probably don’t want (my local `dagster.yaml` only sets the log level, for example). I don’t develop locally with docker-compose though; I usually spin up a dagit instance from inside the user code location/repository I’m currently working on, but you should be able to map your docker-compose to a different `DAGSTER_HOME`.
2. I don’t completely understand what you’re doing with the `dagster.yaml`, but you can always have multiple files and pass the one you want on the command line, as you mentioned. 3. Running `dagster-daemon run --help` on my terminal does show a workspace flag; it might be missing from the docs (didn’t check), but it does work. 4. Check out the fully featured project example. It shows a pattern for dynamically setting the repository depending on the deployment. I think there’s also a way to specify the exact repo the workspace fetches if you have multiple decorated functions, but I can’t seem to find an example for that (and it’s not a pattern I’ve used)
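The deployment-switching pattern mentioned in point 4 boils down to keying settings off an environment variable. A minimal stdlib-only sketch (the variable name `DAGSTER_DEPLOYMENT` and the settings values are illustrative, not taken from the example project):

```python
import os

# Illustrative only: per-deployment settings keyed by an env var,
# falling back to "local" when the variable isn't set.
DEPLOYMENT_SETTINGS = {
    "local": {"io_manager": "fs_io_manager", "workspace": "workspaces/dev.yaml"},
    "prod": {"io_manager": "s3_io_manager", "workspace": "workspaces/prod.yaml"},
}

def settings_for_deployment():
    """Return the settings dict for the current deployment."""
    deployment = os.getenv("DAGSTER_DEPLOYMENT", "local")
    return DEPLOYMENT_SETTINGS[deployment]
```

The repository definition (or resource wiring) can then consult `settings_for_deployment()` at import time, so one module serves both environments.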
Followup: just found the example I was looking for with the different repository functions. It’s from an old commit, so maybe not to be seen as a “best practice”, but the workspace file specifies an `attribute` that seems to map to the repository function. Here’s the docs on it.
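That knob appears to be the `attribute` key under `python_module` (or `python_file`) in the workspace file, which loads one named repository instead of everything the module exports. A sketch, with made-up names:

```yaml
load_from:
  - python_module:
      module_name: dags.repo      # module defining several @repository functions
      attribute: prod_repository  # load only this one (hypothetical name)
```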

Alexander Whillas

08/18/2022, 7:59 AM
hey, yeah thanks for the detailed reply. I should probably just run dagit locally, but I like the local deployment to match production as closely as possible, so I'm doing multi-container deploys locally too. My issue with the `dagster.yaml` file is that (a) you can't specify it on the command line, so you have to set it up in the Dockerfile, which means you can't reuse the same file for local and production (I have a hack that does it, I'll post it below), and (b) I need to have 2 files, as production and local use different run launchers, plus s3 storage for logs etc. in production. I've been working from the docker deployment example. Finally got it going but it's awkward (see Dockerfile below):
```dockerfile
# Dagster
FROM python:3.9-slim as dagster
RUN apt-get update && apt-get install -y git
ENV DAGSTER_HOME /opt/dagster/dagster_home
COPY requirements-dagster.txt .
RUN pip install --no-cache-dir -r requirements-dagster.txt
COPY dags dags
COPY dagster.yaml ./
COPY workspaces workspaces
CMD ["dagster-daemon", "run"]

# Dagit
FROM dagster as dagit
COPY requirements-dagit.txt .
RUN pip install --no-cache-dir -r requirements-dagit.txt
CMD ["dagit", "-h", "0.0.0.0", "-p", "3000", "-w", "workspaces/dev.yaml"]

FROM dagster as user_code
COPY requirements-dagster.txt .
RUN pip install --no-cache-dir -r requirements-dagster.txt
COPY tests tests
CMD ["dagster", "api", "grpc", "-h", "0.0.0.0", "-p", "4000", "-m", "dags.repo"]

# basically a different dagster.yaml file per environment

FROM dagster AS dagster_dev
COPY tmp/dagster-dev.yaml dagster.yaml

FROM dagit AS dagit_dev
COPY tmp/dagster-dev.yaml dagster.yaml

FROM user_code AS user_code_dev
COPY tmp/dagster-dev.yaml dagster.yaml
```
and in the docker-compose file I use the `*_dev` targets, and the other ones in CDK. But like I said, it's a bit awkward
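The docker-compose side of that can look something like this (a sketch; the service names are assumptions, the stage names follow the Dockerfile above):

```yaml
services:
  dagit:
    build:
      context: .
      target: dagit_dev      # pick the *_dev stage; CDK builds the non-dev targets
    ports:
      - "3000:3000"
  daemon:
    build:
      context: .
      target: dagster_dev
  user_code:
    build:
      context: .
      target: user_code_dev
```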