# announcements
t
So, looking at the docs, the Dagster CLI supports a `-c` flag for loading a config from YAML, but when I run `--help` for Dagit, it doesn't show that as an option. Is that facility passed through to Dagster, or is that something that can be specified in the workspace.yaml?
@max do you happen to know the answer here?
m
tobias, is what you're looking for the ability to load a yaml file directly into the config editor in dagit?
i think that's a great idea, but not implemented -- would love if you would open an issue for it
in the meantime, you can use presets for this
(if i understand what you're driving at)
t
Yeah, my use case is that I want my config management to write out the YAML files that specify the config needed by the solids, resources, etc., and then pass that path to dagit/dagster to load so that execution can start without me needing to populate any values in the dagit playground.
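As an illustration, the kind of run-config fragment that config management might write out could look like this (the solid and resource names here are hypothetical, not from this thread):

```yaml
# Hypothetical run config fragment emitted by config management.
solids:
  load_events:
    config:
      bucket: my-data-bucket
resources:
  database:
    config:
      hostname: db.internal
      username: dagster
```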
In https://docs.dagster.io/tutorial/basics_solids, where it shows how to specify config using the dagster CLI, it mentions the `-c` flag for passing a yaml fragment, so I was looking for functionality similar to that.
But I'm deploying my pipeline using dagit, so it would be helpful if there were a way to pass that YAML fragment as an argument to dagit itself, or to specify the path to the yaml fragment in the workspace config.
I had also looked at the preset definitions and was working out the best way to handle loading the yaml files in that scenario. Will take another look now.
Is there a way to specify the preset to use within the workspace.yaml?
m
hmm, ok
what about starting the pipeline runs over graphql?
you can see the schema if you navigate to `/graphiql` on the dagit server
and you can pass arbitrary config there
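(For reference, launching a run over GraphQL might look roughly like the sketch below. The `launchPipelineExecution` mutation and the `executionParams` layout are assumptions based on the dagster schema of this era; the exact fields vary by version, so check `/graphiql` on your own dagit server.)

```python
# Sketch of starting a pipeline run via dagit's GraphQL endpoint.
# Mutation name and variable shape are version-dependent assumptions;
# verify them against /graphiql on your dagit server.
import json
import urllib.request

LAUNCH_MUTATION = """
mutation LaunchRun($executionParams: ExecutionParams!) {
  launchPipelineExecution(executionParams: $executionParams) {
    __typename
  }
}
"""

def build_launch_variables(pipeline_name, run_config, mode="default"):
    """Assemble the GraphQL variables for a launch request."""
    return {
        "executionParams": {
            "selector": {"pipelineName": pipeline_name},
            "runConfigData": run_config,
            "mode": mode,
        }
    }

def launch_run(dagit_url, pipeline_name, run_config):
    """POST the mutation to the dagit server's /graphql endpoint."""
    payload = json.dumps({
        "query": LAUNCH_MUTATION,
        "variables": build_launch_variables(pipeline_name, run_config),
    }).encode()
    req = urllib.request.Request(
        dagit_url + "/graphql",
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

if __name__ == "__main__":
    # Requires a running dagit server, e.g.:
    # launch_run("http://localhost:3000", "my_pipeline", {"solids": {}})
    print(build_launch_variables("my_pipeline", {}))
```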
t
That's a possibility. For now I'm working on adding some presets that use the `from_file` functionality and we'll see how that goes. Down the road I'm sure I'll be using the GraphQL functionality for triggering pipeline runs from things like S3 events.
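(For what it's worth, the effect of combining several YAML fragments into one preset config can be sketched as a recursive dict merge over the parsed files. The `merge_fragments` helper and the fragment contents below are illustrative, not dagster's API:)

```python
# Sketch of deep-merging parsed YAML fragments into one run config,
# approximating what loading multiple config files into a preset does.
# merge_fragments is a hypothetical helper, not part of dagster.
def merge_fragments(base, override):
    """Recursively merge override into base; override wins on scalar conflicts."""
    merged = dict(base)
    for key, value in override.items():
        if key in merged and isinstance(merged[key], dict) and isinstance(value, dict):
            merged[key] = merge_fragments(merged[key], value)
        else:
            merged[key] = value
    return merged

# Fragments as they might look after yaml.safe_load on each file.
solids_fragment = {"solids": {"load_events": {"config": {"bucket": "my-bucket"}}}}
resources_fragment = {"resources": {"database": {"config": {"hostname": "db.internal"}}}}

run_config = merge_fragments(solids_fragment, resources_fragment)
```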