# ask-community
Hello! My team is trying to set up a dynamically loaded repository with the `RepositoryData` class. I adapted the example from the Repository documentation (I'll post code in the comments). The problem: despite configuring the dynamic repo's `working_directory` in `workspace.yaml`, Dagit has trouble finding the modules due to working-directory conflicts. More specifically, if a breakpoint is inserted inside `DynamicRepo.get_all_pipelines()`, the debugger shows that the working directory hasn't changed from the directory Dagit was launched in. Have I misunderstood the documentation? https://docs.dagster.io/concepts/repositories-workspaces/workspaces#loading-relative-imports
My code:
`workspace.yaml`:

```yaml
load_from:
- python_file:
    relative_path: repo.py
    working_directory: /some/far/away/directory
```
`repo.py`, located in a distant folder:

```python
import importlib
from pathlib import Path

from dagster import repository, RepositoryData


class DynamicRepo(RepositoryData):
    def get_all_pipelines(self):
        return [
            self._construct_job_def_from_filename("dynamically_generated_job1.py"),
            self._construct_job_def_from_filename("dynamically_generated_job2.py"),
        ]

    def _construct_job_def_from_filename(self, filename):
        # ... construct jobs by importing them as modules.
        # If we break here and call Path.cwd(), we receive DAGIT_HOME as the
        # working directory; I expected Path.cwd() to return
        # "/some/far/away/directory".
        module_name = Path(filename).stem  # strip ".py"; import_module expects a module name, not a filename
        module = importlib.import_module(module_name)
        return module.__dict__["main_job"]
```