Dinis Rodrigues (01/22/2023, 9:39 PM):
I want to use `execute_job` to test my jobs. I'm using the Databricks step launcher, which requires a reconstructable pipeline, so I'm unable to use `job.execute_in_process`.
From the docs I'm doing:
```python
from dagster import DagsterInstance, execute_job, reconstructable

instance = DagsterInstance.get()
execute_job(reconstructable(my_job), run_config=config, instance=instance)
```
But I get this error:
```
dagster._check.CheckError: Failure condition: Unexpected return value from child process <class 'collections.ChildProcessStartEvent'>
```
Stack trace:
```
File "/opt/conda/envs/dagster_env/lib/python3.9/site-packages/dagster/_core/execution/api.py", line 991, in pipeline_execution_iterator
  for event in pipeline_context.executor.execute(pipeline_context, execution_plan):
File "/opt/conda/envs/dagster_env/lib/python3.9/site-packages/dagster/_core/executor/multiprocess.py", line 240, in execute
  event_or_none = next(step_iter)
File "/opt/conda/envs/dagster_env/lib/python3.9/site-packages/dagster/_core/executor/multiprocess.py", line 364, in execute_step_out_of_process
  check.failed("Unexpected return value from child process {}".format(type(ret)))
File "/opt/conda/envs/dagster_env/lib/python3.9/site-packages/dagster/_check/__init__.py", line 1687, in failed
  raise CheckError(f"Failure condition: {desc}")
```
Am I missing something?

yuhan (01/24/2023, 1:04 AM):
`op_retry_policy`?
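yuhan is presumably asking whether the job sets a default retry policy, since retries interact with how the multiprocess executor drives child processes. For reference, a minimal sketch of where that setting lives; the op and job names below are placeholders, but `RetryPolicy` and the `op_retry_policy` argument are part of Dagster's public API:
```python
from dagster import RetryPolicy, job, op

@op
def flaky_op():
    ...

# op_retry_policy sets a default retry policy for every op in the job.
@job(op_retry_policy=RetryPolicy(max_retries=3, delay=5))
def my_retrying_job():
    flaky_op()
```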
daniel (01/24/2023, 1:07 AM):
The `if __name__ == '__main__'` block is also executing within the step launcher, which is probably not what you want.
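A sketch of the separation daniel is hinting at, assuming the job definition is moved into an importable module (the `my_project.jobs` path is hypothetical) so that the launch script, and its `__main__` block, is never re-run inside the step launcher:
```python
# my_project/jobs.py -- definitions only; safe to import anywhere,
# including from the step launcher's remote process.
from dagster import job, op

@op
def do_work():
    return 1

@job
def my_job():
    do_work()
```
```python
# run_local.py -- launch script, kept separate from the definitions.
from dagster import DagsterInstance, execute_job, reconstructable

from my_project.jobs import my_job

if __name__ == "__main__":
    # Guarded so that child processes re-importing this file cannot
    # kick off a second execution.
    result = execute_job(reconstructable(my_job), instance=DagsterInstance.get())
    assert result.success
```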
Dinis Rodrigues (01/24/2023, 4:09 PM):

daniel (01/24/2023, 4:13 PM):

Dinis Rodrigues (01/24/2023, 4:30 PM):

daniel (01/24/2023, 4:34 PM):

Dinis Rodrigues (01/24/2023, 4:40 PM):

daniel (01/24/2023, 4:52 PM):

Dinis Rodrigues (01/24/2023, 5:19 PM):
I'm now using `dagster job execute` instead of launch. But I still haven't found a workaround for putting the run_config directly in the decorator: my run config depends on environment variables, so when this goes to Databricks, it tries to read environment variables that don't yet exist on the cluster.
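One possible direction, sketched here rather than taken from the thread: build the run_config in the launch script with `os.environ`, so the variables are read on the machine where they exist and only the resolved strings travel to Databricks. The op name and variable names are hypothetical:
```python
import os

# Resolved locally, at launch time -- not on the Databricks cluster.
run_config = {
    "ops": {
        "my_op": {
            "config": {
                "bucket": os.environ["MY_BUCKET"],
                "table": os.environ["MY_TABLE"],
            }
        }
    }
}
```
Keeping this out of `@job(config=...)` matters because the decorator is evaluated at import time, and the job module is also imported on the Databricks side, where those variables are missing.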