Clement Emmanuel
04/17/2023, 11:32 PM
I'm using execute_pipeline_iterator or execute_job to launch a Dagster run from within an op. I do this by passing the existing instance, which I get from the OpExecutionContext of the parent op. This works, and I can see the new run happening in the Dagster UI. However, I want to be able to control the execution of the child job: in a k8s context, I want it to happen on a new pod, but by default it happens on the same pod as the parent op. Is there something I can do here, or is there a different approach I should be taking?

owen
04/18/2023, 3:52 PM
The execute_job Python API is fairly literal: it executes the job within the current process. Instead, you could consider using the Dagster GraphQL Python client. This allows you to submit a job to be executed, and it is functionally identical to hitting the "Launch Run" button in the UI (so it will have identical execution behavior).

Clement Emmanuel
04/18/2023, 3:54 PM
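[Editor's note] A minimal sketch of the GraphQL-client approach suggested above. It assumes the dagster-graphql package is installed and a Dagster webserver is reachable; the host, port, and job name ("my_child_job") are placeholders, and error handling (DagsterGraphQLClientError) is omitted for brevity.

```python
def launch_child_run(job_name: str, host: str = "localhost", port: int = 3000) -> str:
    """Submit a run for `job_name` via the Dagster GraphQL API.

    Unlike execute_job, this goes through the instance's configured run
    launcher, so on a k8s deployment the child run starts in its own pod.
    """
    # Imported lazily so this sketch only needs dagster-graphql when called.
    from dagster_graphql import DagsterGraphQLClient

    client = DagsterGraphQLClient(host, port_number=port)
    # submit_job_execution returns the id of the newly submitted run;
    # pass the child job's run_config here if it needs one.
    return client.submit_job_execution(job_name, run_config={})
```

Calling launch_child_run("my_child_job") from inside the parent op then submits the child job exactly as the "Launch Run" button would.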