Tobias Macey (01/20/2022, 6:54 PM)
PresetDefinition. The examples show how to handle the case where you have a single one being passed to preset_defs in the @pipeline decorator, but there isn't a good example of handling a list of them. It seems the answer is that I have to create multiple instances of the job in order to pass each one into the schedule with the appropriate config?

sandy (01/20/2022, 7:01 PM)

Tobias Macey (01/20/2022, 7:18 PM)

alex (01/20/2022, 7:49 PM)
jobs into separate repositories and only load the relevant repository per installation

Tobias Macey (01/20/2022, 9:08 PM)

paul.q (01/20/2022, 11:27 PM)
GraphQL requests that need to insert the repository location and name.
Our approach was to have a config module that relies on the presence of a special environment variable (e.g. dagster_env). At run time we can use this to determine our environment. We use YAML files like dev.yaml, prod.yaml, etc. to provide the run_config, then we use something like this to ingest the appropriate YAML:
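The "multiple instances" idea in the question can be sketched in plain Python: given a list of preset-like objects (name plus run_config), build one schedule spec per preset. This is a hedged stand-in, not Dagster API; the preset dicts, make_schedules, and the field names are all hypothetical, and real code would use Dagster's PresetDefinition and ScheduleDefinition instead.

```python
# Hypothetical stand-ins for a list of PresetDefinition-like objects:
# each carries a name and the run_config it should be scheduled with.
presets = [
    {"name": "dev",
     "run_config": {"resources": {"io": {"config": {"bucket": "dev-bucket"}}}}},
    {"name": "prod",
     "run_config": {"resources": {"io": {"config": {"bucket": "prod-bucket"}}}}},
]

def make_schedules(pipeline_name, presets, cron="0 * * * *"):
    """Build one schedule spec per preset, mirroring the idea of
    'multiple instances of the job, each paired with its config'."""
    return [
        {
            "schedule_name": f"{pipeline_name}_{p['name']}_schedule",
            "pipeline_name": pipeline_name,
            "cron_schedule": cron,
            "run_config": p["run_config"],
        }
        for p in presets
    ]

schedules = make_schedules("my_pipeline", presets)
print([s["schedule_name"] for s in schedules])
```

The point is just that the list of presets is iterated once, so adding a new environment means appending one entry rather than hand-writing another schedule.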
import pkg_resources
import yaml

def get_run_config(env):
    # e.g. env = "dev" -> dev.yaml
    yaml_file_name = f"{env}.yaml"
    package_string = "<your folder>"  # the package that contains your YAML files
    yaml_content = pkg_resources.resource_string(package_string, resource_name=yaml_file_name).decode("utf-8")
    config = yaml.safe_load(yaml_content)
    return config
In this way we can keep a consistently named repo whose name reflects the purpose of the jobs within.

Tobias Macey (01/21/2022, 12:28 PM)
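For reference, paul.q's environment-variable pattern can be exercised end to end with a small, dependency-free sketch. Here json stands in for YAML (so the example needs only the standard library), the config directory is a temp dir rather than package data, and the dagster_env value and config contents are assumptions for illustration.

```python
import json
import os
import tempfile

def get_run_config(env, config_dir):
    # Load the run_config for the given environment (dev, prod, ...).
    # The original uses pkg_resources + YAML; json files stand in here.
    with open(os.path.join(config_dir, f"{env}.json")) as f:
        return json.load(f)

# Demo: write per-environment configs, then resolve one via the env var.
config_dir = tempfile.mkdtemp()
configs = {
    "dev": {"ops": {"my_op": {"config": {"limit": 10}}}},
    "prod": {"ops": {"my_op": {"config": {"limit": 1000}}}},
}
for env, cfg in configs.items():
    with open(os.path.join(config_dir, f"{env}.json"), "w") as f:
        json.dump(cfg, f)

os.environ["dagster_env"] = "prod"  # normally set per deployment, not in code
run_config = get_run_config(os.environ["dagster_env"], config_dir)
print(run_config["ops"]["my_op"]["config"]["limit"])  # -> 1000
```

The schedule (or job launch) then passes run_config through unchanged, so the same repository code runs in every environment and only the environment variable differs per installation.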