Anthony Reksoatmodjo
08/30/2022, 11:26 PM
I'm building a dynamic repository with the RepositoryData class. I adapted the example from the Repository documentation (I'll post the code in the comments).
The problem: despite configuring the dynamic repo's working_directory in workspace.yaml, Dagit has trouble finding the modules because of a working-directory conflict. More specifically, if I set a breakpoint inside DynamicRepo.get_all_pipelines(), the debugger shows that the working directory hasn't changed from the directory Dagit was launched in.
Have I misunderstood the documentation? https://docs.dagster.io/concepts/repositories-workspaces/workspaces#loading-relative-imports

Anthony Reksoatmodjo
08/30/2022, 11:28 PM
workspace.yaml:
load_from:
  - python_file:
      relative_path: repo.py
      working_directory: /some/far/away/directory
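If working_directory only influences import resolution (roughly, the directory ends up on sys.path rather than triggering an os.chdir()), then the behavior described above is expected: modules under that directory become importable, but Path.cwd() stays wherever the process started. A minimal sketch of that distinction, with a made-up temporary directory standing in for the far-away path:

```python
import importlib
import os
import sys
import tempfile
from pathlib import Path

# Hypothetical stand-in for the far-away working_directory from workspace.yaml.
far_dir = Path(tempfile.mkdtemp())
(far_dir / "dynamically_generated_job1.py").write_text("main_job = 'job1'\n")

cwd_before = os.getcwd()

# Putting the directory on sys.path is enough for import_module to find it;
# nothing calls os.chdir(), so the process working directory never moves.
sys.path.insert(0, str(far_dir))
module = importlib.import_module("dynamically_generated_job1")

assert module.main_job == "job1"
assert os.getcwd() == cwd_before  # cwd is unchanged, as in the debugger session
```

Under this assumption, relying on Path.cwd() inside the repository code would fail exactly as observed, while plain imports would succeed.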
repo.py, located in a distant folder:
import importlib
from pathlib import Path

from dagster import repository, RepositoryData


class DynamicRepo(RepositoryData):
    def get_all_pipelines(self):
        return [
            self._construct_job_def_from_filename("dynamically_generated_job1.py"),
            self._construct_job_def_from_filename("dynamically_generated_job2.py"),
        ]

    def _construct_job_def_from_filename(self, filename):
        # ... construct jobs by importing them as modules.
        # If we break here and call Path.cwd(), we receive DAGIT_HOME as the
        # working directory. I expected Path.cwd() to return "/some/far/away/directory".
        module_name = Path(filename).stem  # drop ".py" so import_module gets a module name
        module = importlib.import_module(module_name)
        return module.__dict__["main_job"]