Koby Kilimnik
10/24/2021, 4:11 PM
...a prod mode and the other a poc1 mode; each defined a configuration and a default env configuration for all of my resources. Now it belongs to the repo, which is a tad weird abstraction-wise.

chris
10/24/2021, 6:49 PM
A pipeline like this:

```python
@pipeline(
    mode_defs=[
        ModeDefinition(name="prod", resource_defs={"a": ..., "b": ...})
    ]
)
def do_stuff():
    ...

my_preset = PresetDefinition(mode="prod", config={"resources": {"a": ...}})
```
Becomes a job like this:

```python
@graph
def do_stuff():
    ...

do_stuff_job = do_stuff.to_job(
    name="do_stuff_prod",
    resource_defs={"a": ..., "b": ...},
    config={"resources": {"a": ...}},
)
```
We can think of a pipeline as a computation graph with n different environments (one for each mode), and a job as a computation graph with a single environment. I think in some of our examples we show the to_job call happening within the repo definition, but that doesn't have to be the case. The job's definition is provided to the repo in the same way as the pipeline's definition is, but now it isn't enforced that all environments are provided to the same repository.
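The pipeline-to-job mapping above can be sketched without dagster at all. In this minimal stand-in (all names are made up for illustration, not dagster APIs), a single computation definition is bound to a different resource set per environment, producing one runnable "job" for each:

```python
def make_job(graph_fn, resources, config):
    """Bind one computation graph to one concrete environment."""
    def job():
        return graph_fn(resources, config)
    job.resources = resources
    job.config = config
    return job

def do_stuff(resources, config):
    # The actual computation: use whatever "a" resource this job was given.
    return resources["a"](config["value"])

# One environment per job, instead of n modes hanging off a single pipeline.
prod_job = make_job(do_stuff, {"a": lambda v: v * 2}, {"value": 21})
poc1_job = make_job(do_stuff, {"a": lambda v: v + 1}, {"value": 21})

print(prod_job())  # 42
print(poc1_job())  # 22
```

Because each job carries its own environment, the two jobs can also live in different repositories, which is the point chris makes above.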
Apologies for the essay. Does that help at all?

dinya
10/25/2021, 5:45 AM
Could we load the default config with the yaml package directly? Smth like
```python
import yaml

from dagster import file_relative_path

# Load the default run config that used to live on the preset.
cfg_file = open(file_relative_path(__file__, "default_config.yaml"))
default_cfg = yaml.full_load(cfg_file)

@graph
def do_stuff():
    ...

do_stuff_job = do_stuff.to_job(
    name="do_stuff_prod",
    resource_defs={"a": ..., "b": ...},
    config=default_cfg,
)
```
or utils like `from dagster import config_from_files`
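The yaml-loading half of that snippet can be sketched self-contained. Here a temp file stands in for default_config.yaml, and the file name and config keys are illustrative, not anything dagster requires:

```python
import os
import tempfile

import yaml  # pyyaml

yaml_text = """\
resources:
  a:
    config:
      conn_string: postgres://localhost
"""

# In real code you would open(file_relative_path(__file__, "default_config.yaml"));
# we round-trip through a temp file just to keep this sketch runnable anywhere.
with tempfile.TemporaryDirectory() as d:
    path = os.path.join(d, "default_config.yaml")
    with open(path, "w") as f:
        f.write(yaml_text)
    with open(path) as f:
        default_cfg = yaml.safe_load(f)

print(default_cfg["resources"]["a"]["config"]["conn_string"])  # postgres://localhost
```

The resulting plain dict is exactly the shape `to_job(config=...)` expects, which is why either hand-rolled yaml loading or dagster's `config_from_files` helper works here.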
Koby Kilimnik
10/25/2021, 7:10 AM

chris
10/25/2021, 3:04 PM

Koby Kilimnik
10/25/2021, 3:06 PM