# ask-community
r
Has anyone seen this error? I think I am struggling to disambiguate the op from the graph from the asset backed by the graph...
j
Hi @Rahul Dave what's the output you're trying to return from the notebook? You aren't required to include the line
```
dagstermill.yield_result(out, output_name="encoders")
```
in order to turn a notebook into an op
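without it, the notebook still runs and the executed notebook is the op's only output — e.g. a minimal sketch (names and paths are just examples, not your setup):
```
# Minimal sketch: a dagstermill op with no yield_result in the notebook.
# It executes the notebook; the only output is the executed notebook itself.
from dagster import file_relative_path
from dagstermill import define_dagstermill_op

plain_notebook_op = define_dagstermill_op(
    name="plain_notebook_op",
    notebook_path=file_relative_path(__file__, "../notebooks/encoder.ipynb"),
    output_notebook_name="output_notebook",
)
```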
r
the notebook has a dictionary in `out` which I would like to save as a software-defined asset, to consume in downstream ops
this is also why I am not using the output notebook as an SDA (I don't care about the output notebook, only the data in the `out` variable)
j
ok that makes sense. So right now, dagstermill assets don't support yielding results, for the reasons outlined in this GitHub issue. When you create a dagstermill op, wrap it in a graph, and then turn that graph into an asset, it's still basically turning the dagstermill op into a dagstermill asset, so yield_result becomes unsupported. Adding yield_result support to the dagstermill asset is in my backlog, but I don't have a firm time for when I'll be able to work on it.
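for reference, the asset-flavored API looks roughly like this — a sketch, assuming the current define_dagstermill_asset signature — and the asset it materializes is the executed notebook itself, not values yielded inside it:
```
# Sketch: define_dagstermill_asset materializes the executed notebook,
# so data created inside the notebook can't (yet) become its own asset.
from dagster import AssetIn, file_relative_path
from dagstermill import define_dagstermill_asset

encoder_notebook = define_dagstermill_asset(
    name="encoder_notebook",
    notebook_path=file_relative_path(__file__, "../notebooks/encoder.ipynb"),
    ins={
        "df_train": AssetIn("train_dataset"),
        "df_test": AssetIn("test_dataset"),
    },
)
```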
r
Interestingly, even adding a downstream no-op did not affect this:
```
import pandas as pd

from dagster import (
    AssetKey,
    AssetsDefinition,
    GraphOut,
    In,
    Out,
    file_relative_path,
    graph,
    op,
)
from dagstermill import define_dagstermill_op

# Dagstermill op: runs the notebook and declares an "encoders" dict output
# alongside the executed output notebook.
encoder_op = define_dagstermill_op(
    name="encoder_op",
    notebook_path=file_relative_path(__file__, "../notebooks/encoder.ipynb"),
    output_notebook_name="output_encoder",
    outs={"encoders": Out(dict)},
    ins={"df_train": In(pd.DataFrame), "df_test": In(pd.DataFrame)},
)

# Downstream no-op, added to see whether it changes the asset conversion
@op
def pass_thru(encoders):
    return encoders

@graph(out={"result": GraphOut()})
def encoder_graph(df_train, df_test):
    encoders, _ = encoder_op(df_train, df_test)
    return pass_thru(encoders)

encoder_asset = AssetsDefinition.from_graph(
    encoder_graph,
    keys_by_input_name={
        "df_train": AssetKey("train_dataset"),
        "df_test": AssetKey("test_dataset"),
    },
    keys_by_output_name={"result": AssetKey("encoders_asset")},
)
```
So the "assetization" is passed back through the graph!!
Is there a workaround? How do you suggest I approach this? Wrap papermill in a Python executor using their Python API? Or use events... is there an example of this?
Since events will create AssetMaterializations, I would lose the graph structure...
(BTW, in case you are wondering, I am trying to wrap an entire machine learning pipeline into Dagster)
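to make the "wrap papermill" idea concrete, I'm imagining something like this untested sketch — the notebook would have to write `out` to an agreed-upon path itself:
```
# Untested sketch: execute the notebook with papermill's Python API from a
# plain op, then read back a file the notebook wrote by convention.
import pickle

import papermill as pm
from dagster import op

@op
def run_encoder_notebook() -> dict:
    pm.execute_notebook(
        "notebooks/encoder.ipynb",
        "notebooks/encoder_out.ipynb",
    )
    # assumes the notebook pickles its `out` dict to this path
    with open("data/encoders.pkl", "rb") as f:
        return pickle.load(f)
```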
j
AssetMaterialization events would work, but yes, you'd lose the graph structure. For full support we need to make a version of `dagstermill_asset` that uses a multi-asset instead of a single asset
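the event route would look roughly like this (a sketch; the asset key and op are just examples):
```
# Sketch: log an AssetMaterialization event from a plain op. The run will
# record the materialization, but it isn't wired into the asset graph.
from dagster import AssetMaterialization, op

@op
def materialize_encoders(context, encoders: dict) -> dict:
    context.log_event(
        AssetMaterialization(
            asset_key="encoders_asset",
            description="encoders dict produced by the notebook",
        )
    )
    return encoders
```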
r
I guess the other option is to forgo assets and run everything as a graph of ops, with the file interchange done by convention
j
yeah that would also be an option
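e.g. two ops agreeing on a path (a sketch — the path convention is just an example, not a Dagster API):
```
# Sketch: plain ops exchanging data through an agreed-upon file path
# instead of software-defined assets.
import pickle

from dagster import op

ENCODERS_PATH = "data/encoders.pkl"  # example convention

@op
def write_encoders(encoders: dict) -> str:
    with open(ENCODERS_PATH, "wb") as f:
        pickle.dump(encoders, f)
    return ENCODERS_PATH

@op
def read_encoders(path: str) -> dict:
    with open(path, "rb") as f:
        return pickle.load(f)
```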
r
That does seem to lose the core SDA advantage of Dagster, though (although that is lost in event-driven materialization too!)
j
yeah, for sure. For what it's worth, being able to yield results from asset notebooks is a feature I'd really like to work on, but I have higher-priority features at the moment. I may be able to find some time to squeeze it in, but I can't make any promises. If you want to join the #dagster-noteable channel, I'll announce new dagstermill features there
❤️ 1
r
will join! Please let me know how I can help. I think I will go with ops and graphs for now and upgrade to assets when you give me the green light
c
I just ran into this issue as well and would love to see it resolved. Maybe a quick solution would be to not yield the resulting completed notebook?
j
The code change to do that might be as intensive as converting to a multi-asset. I can look into it, but I can't make any promises
c
Thanks @jamie
j
I do want to reiterate that I want to build this feature! I just have some high-priority things I'm working on right now that I can't put on hold
r
Thanks @jamie!
c
@jamie have you been able to work on this feature?
j
not yet, unfortunately. I recommend following the GitHub issue I linked earlier in the thread for updates!
r
Lemme add my voice to the chorus for the issue! There is a whole set of apps this will unleash!
c
@jamie oops sorry, I will watch the issue!
j
no worries! easy to get lost so far up the thread