# ask-community
t
Okay, I’m starting with a silly question but I can’t see it in the docs and don’t want to do the full setup to find out… If I run a job via `my_job.execute_in_process()`, will that job be shown in the Dagit UI? This happens in Prefect but I can’t tell if the same is true with Dagster.
s
No, in Dagster's case the script can be totally decoupled from Dagit / the Dagster daemon, so it doesn't appear there. `execute_in_process` is commonly used for testing scenarios, when you want to simulate the code being run via the UI/daemon without the actual overhead of the UI/daemon.
t
Ah okay, thanks for the info. Is there any mechanism to create jobs on the fly that will show in the UI/be schedulable? I’ve got user submitted pipeline configs that I need to turn into jobs. I love the look of Dagster but not sure it meets that use case
j
This will send a GraphQL request to Dagit, so the run will be stored and will appear in the UI.
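For context, the request in question is a run-launching mutation against Dagit's GraphQL endpoint. A sketch of its general shape (field and type names per my understanding of Dagster's GraphQL API; treat them as approximate):

```graphql
# Launch a run of an already-loaded job via Dagit's GraphQL endpoint.
mutation LaunchRun($executionParams: ExecutionParams!) {
  launchRun(executionParams: $executionParams) {
    __typename
    ... on LaunchRunSuccess {
      run {
        runId
        status
      }
    }
    ... on PythonError {
      message
    }
  }
}
```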
t
@johann does the job have to be pre-defined in the above case?
As in could I construct one in code at runtime and then submit and run that newly created job definition?
j
Ah, I missed that. That query expects that the job is already in the workspace. It would make sense to have some form of build step that creates the job and puts it in the workspace. cc @sandy
s
@Tom Manterfield is there a core "process" that just needs to be configurable on submission? in that case, you can vary the `run_config` for the submitted job and it will execute. one example for us is that we have a single "meltano" job for all ETL processes, but a number of different schedules that configure the job for different loads
t
It’s really quite dynamic. Users could be configuring completely different actions to run (think a GitHub Actions / CircleCI-style YAML, so very open ended).
s
oh, gotcha, that's cool!