Hello Dagster, I'm running `dagit`, `daemon` and ...
# ask-community
a
Hello Dagster, I'm running `dagit`, `daemon`, and one `user-code` server, each in its own Docker container. I use the default RunLauncher and Executor. Where exactly does the job and op computation take place, i.e. in which container are the Python processes created:
• When I launch a job via Dagit?
• When a job is requested by a sensor or schedule?
m
I don't have the answer to your question, but I have a suggestion: you can just create a dummy job like `sleep(10000)` and check the processes of your containers to find your answer.
In fact, you can also just create a file as part of your job so that you don't even have to check the processes. Something like `Path("/find_me").touch()` should do the trick.
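For reference, here is a minimal sketch combining both suggestions above (the op and job names are illustrative, not from the thread): launch it, then check which container has the `/find_me` file or the sleeping process.

```python
from pathlib import Path
from time import sleep

from dagster import job, op


@op
def find_me():
    # Leave a marker file in whichever container executes this op,
    # then sleep long enough to inspect the running processes.
    Path("/find_me").touch()
    sleep(10000)


@job
def locate_execution_container():
    find_me()
```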
a
Indeed I can do that, that's a very good point 🙏
d
Hi Antoine - in both cases they’ll execute in your user code container
🙏 1
a
Hello @daniel, thanks, very clear! But the sensor and schedule code that requests the run will run in the `daemon`, right?
d
Dagit and the daemon will never run your code actually - that happens in the user code container too
🙏 1
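As context for why this is the case: in a deployment like this, `dagit` and the `daemon` typically only know about the code location through a `workspace.yaml` gRPC entry, roughly like the sketch below (the host name, port, and location name are assumptions based on a typical docker-compose setup, not taken from the thread).

```yaml
# workspace.yaml shared by the dagit and daemon containers:
# they connect to the user-code container over gRPC and never
# import or execute the repository code themselves.
load_from:
  - grpc_server:
      host: user_code   # docker-compose service name of the user-code container (assumed)
      port: 4000        # port the user-code gRPC server listens on (assumed)
      location_name: "user_code"
```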
a
Ok, so any credentials passed through env vars (like AWS credentials) should only be set in `user-code`. And `dagit` and `daemon` don't use the Dagster repository (the code), only the gRPC connection to `user-code`.
Does that mean a job with a config that uses a `StringSource` pointing to an env var, launched through `dagit`, will use the env var from `user-code` and not from `dagit`?
d
that's right
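To make that concrete, here is a minimal sketch of a `StringSource`-backed config field (the op, job, field, and env var names are illustrative). Per the answer above, the env var is read in the `user-code` container where the op executes, so the credential only needs to exist in that container's environment.

```python
from dagster import Field, StringSource, job, op


@op(config_schema={"bucket": Field(StringSource)})
def use_bucket(context):
    # Whatever this field resolves to, it is resolved in the container
    # that executes the op -- the user-code container, not dagit.
    context.log.info(f"bucket = {context.op_config['bucket']}")


@job
def bucket_job():
    use_bucket()


# Launchpad / run config equivalent, pointing the field at an env var:
# ops:
#   use_bucket:
#     config:
#       bucket:
#         env: MY_BUCKET_ENV_VAR
```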
a
Ok, thanks a ton 🙏