# announcements

cvb

04/08/2021, 11:13 AM
Hi guys, I'm using the multiprocess executor and launching my pipeline from a Jupyter notebook. Multiprocess execution requires me to make my pipeline reconstructable with dagster.reconstructable(mypipeline). It works fine, except that now I can't reload my pipeline without restarting Python: somehow reconstructable always returns the same pipeline. Any ideas how to force it to reload? Here is what I mean, with examples in IPython:
In [1]: import tst_pipe

In [2]: import dagster
In [4]: orig_solids = [s.name for s in tst_pipe.pipe.solids]
   ...: recons = [s.name for s in dagster.reconstructable(tst_pipe.pipe).get_definition().solids]
   ...: print('original', orig_solids)
   ...: print('recons', recons)
original ['c', 'b', 'a']
recons ['c', 'b', 'a']
Looks fine. Now I add a new step to my file and rerun the same code:
In [5]: orig_solids = [s.name for s in tst_pipe.pipe.solids]
   ...: recons = [s.name for s in dagster.reconstructable(tst_pipe.pipe).get_definition().solids]
   ...: print('original', orig_solids)
   ...: print('recons', recons)
original ['c', 'b', 'd', 'a']
recons ['c', 'b', 'a']
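
For context, a minimal sketch of what a module like tst_pipe might contain; the solid bodies below are made up, only the names mirror the output above:

# tst_pipe.py -- hypothetical stand-in for the module in question
from dagster import pipeline, solid

@solid
def a(context):
    return 1

@solid
def b(context, x):
    return x + 1

@solid
def c(context, x):
    return x * 2

@pipeline
def pipe():
    c(b(a()))

And a sketch of the kind of launch that forces the reconstructable wrapper; the run config and instance handling are assumptions (the multiprocess executor needs a non-ephemeral DagsterInstance, e.g. with DAGSTER_HOME set), not the poster's exact call:

import dagster
import tst_pipe

# Multiprocess execution requires a reconstructable pipeline and a
# persistent instance; each solid then runs in its own subprocess.
result = dagster.execute_pipeline(
    dagster.reconstructable(tst_pipe.pipe),
    run_config={"execution": {"multiprocess": {"config": {"max_concurrent": 4}}}},
    instance=dagster.DagsterInstance.get(),
)
assert result.success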

alex

04/08/2021, 5:47 PM
get_definition is using @lru_cache, so I believe you can use cache_clear on it: https://stackoverflow.com/questions/37653784/how-do-i-use-cache-clear-on-python-functools-lru-cache
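
For illustration, this is just standard functools behaviour: any function wrapped in @lru_cache gains a cache_clear() method that empties its cache, so the next call recomputes the result (generic example, not Dagster code):

import functools

@functools.lru_cache(maxsize=None)
def load_config(path):
    # expensive work happens only on a cache miss
    print(f"loading {path}")
    return {"path": path}

load_config("a.yaml")       # prints "loading a.yaml"
load_config("a.yaml")       # served from the cache, prints nothing
load_config.cache_clear()   # drop every cached result
load_config("a.yaml")       # recomputed, prints "loading a.yaml" again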

cvb

04/09/2021, 6:51 AM
It works, thanks! I had to clear the cache for both the repository and the pipeline:
dagster.reconstructable(tst_pipe.pipe).repository.get_definition.cache_clear()

dagster.reconstructable(tst_pipe.pipe).get_definition.cache_clear()
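
Putting the thread together, a sketch of a full reload cycle in a notebook might look like the following; the explicit importlib.reload is an assumption (IPython's autoreload would cover that step), and only the cache_clear calls come from the messages above:

import importlib
import dagster
import tst_pipe

# pick up the edited module source
importlib.reload(tst_pipe)

# clear the lru_cache on both the repository and the pipeline definition
# so reconstructable() stops handing back the stale pipeline
recon = dagster.reconstructable(tst_pipe.pipe)
recon.repository.get_definition.cache_clear()
recon.get_definition.cache_clear()

# the reconstructed definition should now include the new solid
print([s.name for s in recon.get_definition().solids])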