pdpark
07/29/2021, 6:37 PM
@repository
def prod_repo():
    run_config = get_params()
    return [
        my_graph.to_job(
            config={
                "solids": {
                    "do_stuff": {
                        "config": {
                            "a_param": run_config.get("a_param")
                        }
                    }
                }
            }
        )
    ]
`do_stuff` is a "mapped" op downstream from a `DynamicOutputDefinition` op: `dynamic_output_op().map(do_stuff)`. When `do_stuff` is called, `a_param` is different - I was expecting the value to be the same…?
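For reference, a minimal sketch of the kind of dynamic-mapping graph being described - the op bodies, the `a_param` config schema, and the use of the op-based `DynamicOut` API (rather than the older `DynamicOutputDefinition`) are assumptions pieced together from the thread:

from dagster import DynamicOut, DynamicOutput, graph, op

@op(out=DynamicOut())
def dynamic_output_op():
    # fan out: one DynamicOutput per item to process
    for i in range(3):
        yield DynamicOutput(i, mapping_key=str(i))

@op(config_schema={"a_param": str})
def do_stuff(context, item):
    # a_param is supplied through the job's run config
    context.log.info(f"item={item}, a_param={context.op_config['a_param']}")

@graph
def my_graph():
    dynamic_output_op().map(do_stuff)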
alex
07/29/2021, 6:46 PM
`get_params`?
pdpark
07/29/2021, 6:48 PM
alex
07/29/2021, 6:51 PM
pdpark
07/29/2021, 6:53 PM
alex
07/29/2021, 6:56 PM
You could use `get_params` to get the run_config that the schedule submits.
I think this would actually work from dagit as well - since there it would take this blob config, load it into the editor, and then submit that explicit version.
I bet in the dagster CLI it would work too if you did --preset default, which would cause the same "submit explicit copy of config" behavior.
You are currently getting burned since the job's "default config" (used when no explicit run config is provided) is part of its in-memory definition and is changing as it's reloaded in each process.
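A minimal sketch of that suggestion, assuming the runs are kicked off by a schedule: move the `get_params` call into the schedule function so each tick submits an explicit run_config snapshot instead of relying on the job's in-memory default. The `@schedule(job=..., cron_schedule=...)` form and the job/schedule names are assumptions rather than code from this thread; `my_graph` and `get_params` are as in the snippets above.

from dagster import schedule

my_job = my_graph.to_job(name="my_job")

@schedule(job=my_job, cron_schedule="0 * * * *")
def my_job_schedule(context):
    # evaluated on every schedule tick, so the submitted run config is an
    # explicit snapshot of whatever get_params() returns at that moment
    run_config = get_params()
    return {
        "solids": {
            "do_stuff": {
                "config": {"a_param": run_config.get("a_param")}
            }
        }
    }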
pdpark
07/29/2021, 7:02 PM
dagster pipeline launch \
    --pipeline {graph} \
    --workspace work_{env}/workspace.yaml
Add --preset default to this command?
alex
07/29/2021, 7:05 PM
pdpark
07/29/2021, 7:06 PM
alex
07/29/2021, 7:30 PM
sandy
07/29/2021, 8:05 PM
from datetime import datetime

from dagster import ConfigMapping, repository

def config_fn():
    current_time = datetime.now()
    return {
        "solids": {
            "do_stuff": {
                "config": {
                    "a_param": current_time
                }
            }
        }
    }

@repository
def prod_repo():
    return [
        my_graph.to_job(
            config=ConfigMapping(config_fn=config_fn)
        )
    ]
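For what it's worth, the practical difference from the original snippet is when the config gets computed: a `ConfigMapping`'s `config_fn` is evaluated while run config is being resolved for each run, rather than once when the repository definition is loaded, so `current_time` here reflects launch time instead of whenever the repository was last reloaded in a given process.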
pdpark
07/29/2021, 8:06 PM
sandy
07/29/2021, 8:48 PM
pdpark
07/30/2021, 1:55 PM
I'm using `make_values_resource`. I'm only specifying the resource value in the config file, not any solid config values, like this:
resources:
  values:
    config:
      graph_run_dts: "20210730T111848.842Z"
Getting this error:
dagster.core.errors.DagsterUserCodeProcessError: dagster.core.errors.DagsterInvalidConfigError: Error in config mapping for pipeline part_graph mode default
    Error 1: Received unexpected config entry "resources" at the root.
There are other parts (the `graph` and `op`s) that require config values, which I'm passing via `a_graph.to_job(config={...})`. If I supply a config file, can I just include resources, or do I have to include all the config values as well, which I'm currently setting in the `to_job` call?
alex
08/02/2021, 2:02 PM
Did you set `required_resource_keys` on the ops for the `values` resource?
pdpark
08/02/2021, 10:07 PM
`required_resource_keys`, but I actually refactored this, so it's not an issue for me now. I had some values I was passing to several of my `op`s - such as the env (prod, dev) - that I moved to a `values` resource.
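A minimal sketch of that kind of refactor - sharing per-run values like the env across ops with `make_values_resource` plus `required_resource_keys`; the op, graph, and field names here are illustrative rather than taken from the thread:

from dagster import graph, make_values_resource, op

@op(required_resource_keys={"values"})
def do_stuff(context):
    # shared values come from the resource rather than per-op config
    env = context.resources.values["env"]
    context.log.info(f"running in {env}")

@graph
def a_graph():
    do_stuff()

a_job = a_graph.to_job(
    resource_defs={"values": make_values_resource(env=str, graph_run_dts=str)}
)

# With no op-level config_schema left, the run config only needs the resource section, e.g.:
# resources:
#   values:
#     config:
#       env: "dev"
#       graph_run_dts: "20210730T111848.842Z"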