# ask-community
Hi everyone, I'm using the build_reconstructable_pipeline method to make my pipeline reconstructable in pythonic execution. Below is the code snippet, and for it I'm getting a warning saying:
UserWarning: Module builder was resolved using the working directory. The ability to load uninstalled modules from the working directory is deprecated and will be removed in a future release.  Please use the python-file based load arguments or install builder to your python environment.
```python
reconstructable_pipeline = build_reconstructable_pipeline(
```
Is there a way to pass a module name here, since it expects the first two arguments as strings?
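For reference, the first two arguments are strings naming the module and the factory function, so passing the dotted name of a module that is installed in the environment (rather than one resolved from the working directory) should avoid the warning. A minimal sketch, assuming a legacy (pre-1.0) Dagster API and a hypothetical installed package `mypackage.pipelines` that exposes a `make_pipeline()` factory returning a PipelineDefinition:

```python
from dagster import build_reconstructable_pipeline

# "mypackage.pipelines" and "make_pipeline" are hypothetical names: the
# module must be importable from the installed environment (not just the
# working directory), and the named function must return the pipeline.
reconstructable_pipeline = build_reconstructable_pipeline(
    "mypackage.pipelines",    # reconstructor module name (string)
    "make_pipeline",          # reconstructor function name (string)
    reconstructable_args=(),  # positional args forwarded to the factory
    reconstructable_kwargs={},
)
```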
Is this error message still valid? @daniel @prha
To clarify - it's complaining that the module was resolved only using the current working directory and is not something that's installed in the current Python environment
@Dylan Hunt what version of dagster are you using? This is something that changed recently (maybe
Yeah, I understood that Dagster is using importlib to import that module and then the function from it using the second argument. But if that is how it is designed, why is it asking me to use python-file based load arguments or install it as a library? If it were possible, I could directly pass the function object here.
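For what it's worth, the string-pair resolution being described can be reproduced with the plain standard library; this is a stdlib-only sketch of the mechanism (not Dagster's actual code path), demonstrated with a stdlib function in place of a pipeline factory:

```python
import importlib

def resolve_factory(module_name: str, function_name: str):
    """Resolve a (module name, function name) string pair the way a
    reconstructable target is looked up: import the module by name,
    then fetch the named attribute from it."""
    module = importlib.import_module(module_name)
    return getattr(module, function_name)

# Demonstration with a stdlib target instead of a pipeline factory:
sqrt = resolve_factory("math", "sqrt")
print(sqrt(9.0))  # 3.0
```

This is also why strings are required rather than a function object: strings can be pickled and re-resolved inside a fresh subprocess, whereas arbitrary function objects defined in a script often cannot.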
@prha Yes I'm using
I believe that we just stopped issuing that warning; it dates from a bygone era where this scenario would cause issues when supporting cron-based schedulers
I thought that we stopped issuing the warning in
, but it may have been
Is there any other way to build a reconstructable pipeline? This approach is consuming too much time, and using the filesystem as the IO manager is causing latency issues
What is your overall goal? I think you may be looking for https://github.com/dagster-io/dagster/issues/4041. The reconstructable pipeline machinery exists to enable loading the definitions in the subprocesses created for multiprocess execution. Filesystem IO is used in multiprocess execution since communication needs to happen across process boundaries.
@alex I'm trying to run dagster in a pythonic way, where I pass the pipeline and config as YAML. For that I'm using construct_pipeline and passing the PipelineDefinition to execute_pipeline. It is working well now: it has 5 ops and usually runs in 8-9 seconds. In an attempt to parallelize the independent ops and reduce the overall execution time, I used build_reconstructable_pipeline and added multiprocess to the run_config, using the filesystem as the IO manager. It runs, but in this approach it takes 40-50 seconds. So I suspect that the filesystem IO manager, and the way we import the pipeline factory using importlib, are increasing the execution time. I'm just looking for some way to reduce the timing and parallelize the execution of independent solids in Python.
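For context, the multiprocess setup being described usually looks something like this in the run_config on legacy Dagster (the exact schema is version-dependent; `max_concurrent` and the `base_dir` field on the filesystem IO manager are from the legacy multiprocess executor and fs_io_manager config, so treat this as a sketch):

```yaml
execution:
  multiprocess:
    config:
      max_concurrent: 4   # number of op subprocesses to run at once
resources:
  io_manager:
    config:
      base_dir: /tmp/dagster_runs   # where intermediate outputs are persisted
```

Note that each op runs in its own spawned subprocess and every op output makes a round trip through the filesystem, so for a pipeline that only takes ~9 seconds in-process, that per-step overhead can plausibly outweigh any parallelism gain, which would be consistent with the 40-50 second observation above.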