# ask-community
e
Hi! Before the 1.0 API we were using multi-process execution through the API (we create dynamic run configurations and integrate the execution result into other tools) by leveraging the `execute_pipeline` function. That function is now legacy with the move from Pipelines to Jobs, but what is the replacement for it? I can only see `job.execute_in_process`, which would not give multi-process execution.
a
We are still working on the replacement for that API. For now you can pass a `job` to `execute_pipeline` and it will work just as it used to.
As one of the users looking for this functionality, it may be interesting to get some context on your use case as we design its replacement. cc @chris
e
Yes, this is what I'm doing now (but I was concerned by the "legacy" status of this, so I will wait 😉). Regarding the context: our overall environment is Azure Machine Learning Services. We are using Dagster for "unit" jobs (Data Engineering or Data Science) since it brings us a lot of value: linking the basic blocks together, clearly separating code from configuration/IOs, and so on, in a way as Python enhancements. Then those jobs are triggered from AMLS pipelines/experiments (with dynamic configurations), and the results (event lists) are transformed into AMLS events (especially for materializations), hence the need for the API path.
Thanks!
c
Thanks for the insight into your use case @Eric Cheminot! We're trying to figure out what the right design is for a Python API that would replace `execute_pipeline`.