# ask-community
Hey, I have Dagster deployed as a service in a CI/CD environment and I ran into a problem when trying to upgrade the Python version. It looks like the dagster-daemon was reading an old version of the repository configuration from the database, including a path to each repo's Python executable. When it tried to check the schedules and sensors, it would start a subprocess with the old Python executable and fail because it was no longer there. To get the schedules and sensors running again, I had to toggle off an old version of the repositories from the Dagit UI, then toggle a new version on. Is there a way to auto-propagate workspace changes to the dagster-daemon upon deployment?
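(For context, the relevant configuration is the workspace.yaml entry that pins a repository location to a specific interpreter; a minimal sketch, where the file name and paths are illustrative rather than the actual config:)

```yaml
# workspace.yaml - illustrative example. The pinned executable_path is what
# goes stale after a Python upgrade removes the old interpreter.
load_from:
  - python_file:
      relative_path: repo.py
      executable_path: /root/.pyenv/versions/3.9.5/envs/general/bin/python3.9
```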
Hi Albert - do you possibly have the exact stack trace of the error that you were running into? I just want to check exactly where it was happening (i.e. was it when launching the run, or when checking the schedule)
Apologies, it pasted in reverse order - in the right order it's:

daemon caught an error for sensor ****** : FileNotFoundError: [Errno 2] No such file or directory: '/root/.pyenv/versions/3.9.5/envs/general/bin/python3.9'
Stack Trace:
  File "/root/.pyenv/versions/3.9.10/envs/general/lib/python3.9/site-packages/dagster/core/host_representation/grpc_server_registry.py", line 176, in _get_grpc_endpoint
    server_process = GrpcServerProcess(
  File "/root/.pyenv/versions/3.9.10/envs/general/lib/python3.9/site-packages/dagster/grpc/server.py", line 1094, in __init__
    self.server_process = open_server_process(
  File "/root/.pyenv/versions/3.9.10/envs/general/lib/python3.9/site-packages/dagster/grpc/server.py", line 996, in open_server_process
    server_process = open_ipc_subprocess(subprocess_args)
  File "/root/.pyenv/versions/3.9.10/envs/general/lib/python3.9/site-packages/dagster/serdes/ipc.py", line 192, in open_ipc_subprocess
    return subprocess.Popen(parts, creationflags=creationflags, **kwargs)
  File "/root/.pyenv/versions/3.9.10/envs/general/lib/python3.9/site-packages/sentry_sdk/integrations/stdlib.py", line 198, in sentry_patched_popen_init
    rv = old_popen_init(self, *a, **kw)  # type: ignore
  File "/root/.pyenv/versions/3.9.10/lib/python3.9/subprocess.py", line 951, in __init__
    self._execute_child(args, executable, preexec_fn, close_fds,
  File "/root/.pyenv/versions/3.9.10/lib/python3.9/subprocess.py", line 1821, in _execute_child
    raise child_exception_type(errno_num, err_msg, err_filename)
No problem. And am I right in thinking there was now a list of "Unloadable Sensors" in the dagit UI that you had to turn off?
Got it - this is something we want to improve (we're actually making some changes in this week's release to make the daemon more 'workspace aware' to unblock this, so it's definitely on our radar). For now, though, you do need to turn the sensors off and back on when there are underlying changes to your workspace like this. There's a bit more explanation in the docs here: https://docs.dagster.io/concepts/repositories-workspaces/workspaces#identifying-repository-locations
So that encompasses any changes to the workspace yaml other than new jobs?
Is there a dagit cli command I can use to toggle schedules and sensors programmatically?
Any changes to the workspace.yaml, yes. And yeah, there are CLI commands for that:

dagster schedule start
dagster schedule stop
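A post-deploy hook that shells out to those commands could look like this - a minimal sketch, assuming the schedule names are known up front and the `dagster` CLI is on the PATH (the names are placeholders, and exact CLI flags can vary across Dagster versions):

```python
# Sketch of a post-deploy step that bounces schedules so the daemon
# picks up the new workspace. Schedule names below are placeholders.
import subprocess


def toggle_commands(schedule_names):
    """Build the `dagster schedule` CLI invocations: stop, then start, each schedule."""
    cmds = []
    for name in schedule_names:
        cmds.append(["dagster", "schedule", "stop", name])
        cmds.append(["dagster", "schedule", "start", name])
    return cmds


def toggle(schedule_names):
    # Requires the dagster CLI on PATH; check=True fails the deploy if a toggle fails.
    for cmd in toggle_commands(schedule_names):
        subprocess.run(cmd, check=True)
```

The same idea would apply to sensors with the corresponding `dagster sensor` subcommands.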
np - hopefully we can smooth out this rough edge soon