# ask-community

Abhinav Ayalur

12/05/2022, 6:50 PM
Hey there, was wondering if there are ways to "yield" outputs instead of outputting all at once? So like single input, many outputs, but the many outputs can be processed individually by downstream tasks

owen

12/05/2022, 6:55 PM
the quick answer is yes! If you want to perform the same operation on each of those outputs (i.e. you want to process N files in the same way), then you'd probably want DynamicOutputs: https://docs.dagster.io/concepts/ops-jobs-graphs/dynamic-graphs#a-dynamic-job. If each of those outputs is different in nature, you can do something like
```python
from dagster import op, Out, Output

@op(out={"some_output": Out(), "another_output": Out()})
def multiple_output_op(some_input):
    # ...
    yield Output(value=10, output_name="some_output")
    # ...
    yield Output(value="blah", output_name="another_output")
```
then, when you're using multiple_output_op in a graph, you can do
```python
@graph
def do_stuff():
    some_output, another_output = multiple_output_op(foo())
    do_something(some_output)
    do_another_thing(another_output)
```
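For the first case, a minimal sketch of the DynamicOutput pattern from the linked docs (the op names and the fixed count of three pieces here are illustrative, not from the thread):

```python
from dagster import DynamicOut, DynamicOutput, job, op

@op(out=DynamicOut())
def load_pieces():
    # Yield one DynamicOutput per item; mapping_key names each downstream branch.
    for idx in range(3):
        yield DynamicOutput(idx, mapping_key=f"piece_{idx}")

@op
def process_piece(piece: int) -> int:
    # Placeholder per-item processing.
    return piece * 2

@job
def dynamic_job():
    # .map() runs process_piece once per DynamicOutput yielded above.
    load_pieces().map(process_piece)
```

Each mapped invocation runs as its own step, so the pieces are processed individually downstream.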

Abhinav Ayalur

12/05/2022, 7:11 PM
Conversely, is there a way to have an op listen to an "Iterator" that pipes in assets as they occur?
The scenario would be some sort of aggregation operation that can start without all of the assets being present

owen

12/05/2022, 7:15 PM
there's no built-in way of doing that unfortunately -- ops/assets only start execution once all upstream outputs are available
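The closest built-in pattern to that kind of aggregation is a fan-in over dynamic outputs: .collect() gathers every mapped result into a single list, but per the above, the aggregating op only starts once every branch has finished. A sketch, reusing the same illustrative op names as the earlier example:

```python
from typing import List

from dagster import DynamicOut, DynamicOutput, job, op

@op(out=DynamicOut())
def load_pieces():
    for idx in range(3):
        yield DynamicOutput(idx, mapping_key=f"piece_{idx}")

@op
def process_piece(piece: int) -> int:
    return piece * 2

@op
def aggregate(results: List[int]) -> int:
    # This op waits for every mapped branch to complete before it runs.
    return sum(results)

@job
def fan_in_job():
    aggregate(load_pieces().map(process_piece).collect())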

Abhinav Ayalur

12/05/2022, 7:16 PM
cool thanks!