# ask-community
a
Hey there, was wondering if there are ways to "yield" outputs instead of outputting all at once? So like single input, many outputs, but the many outputs can be processed individually by downstream tasks
o
the quick answer is yes! If you want to perform the same operation on each of those outputs (i.e. you want to process N files in the same way), then you'd probably want DynamicOutputs: https://docs.dagster.io/concepts/ops-jobs-graphs/dynamic-graphs#a-dynamic-job (see the sketch after the examples below). If each of those outputs is different in nature, you can do something like
from dagster import op, Out, Output

# declare one named Out per output so downstream ops can consume each one separately
@op(out={"some_output": Out(), "another_output": Out()})
def multiple_output_op(some_input):
    # ...
    yield Output(value=10, output_name="some_output")
    # ...
    yield Output(value="blah", output_name="another_output")
then, when you're using multiple_output_op in a graph, you can do
from dagster import graph

@graph
def do_stuff():
    # foo, do_something, and do_another_thing are other ops defined elsewhere
    some_output, another_output = multiple_output_op(foo())
    do_something(some_output)
    do_another_thing(another_output)
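For the dynamic case mentioned above, here's a minimal sketch of DynamicOutputs, assuming you want to fan out the same op over N pieces of data (load_pieces, process_piece, and the sample values are illustrative names, not from the thread):

from dagster import DynamicOut, DynamicOutput, graph, op

@op(out=DynamicOut())
def load_pieces():
    # yield one DynamicOutput per piece; mapping_key identifies each downstream step
    for idx, piece in enumerate(["a", "b", "c"]):
        yield DynamicOutput(piece, mapping_key=str(idx))

@op
def process_piece(piece: str) -> str:
    return piece.upper()

@graph
def dynamic_stuff():
    # .map() runs process_piece once per yielded DynamicOutput
    load_pieces().map(process_piece)

Each mapped step shows up as its own op execution, so failures and retries are per-piece.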
a
Relatedly, is there a way to have an op listen to an "Iterator" that pipes in assets as they are occurring?
The scenario would be some sort of aggregation operation that can start without all of the assets being present.
o
there's no built-in way of doing that unfortunately -- ops/assets only start execution once all upstream outputs are available
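To illustrate those semantics with the dynamic sketch above: a fan-in via .collect() hands the aggregation op all the mapped outputs as one list, and it only starts once every mapped step has finished (merge_results is an illustrative name):

from dagster import graph, op

@op
def merge_results(results: list) -> int:
    # receives every mapped output at once; it cannot start incrementally
    return len(results)

@graph
def aggregate_stuff():
    # .collect() fans the mapped outputs back in as a single list input
    merge_results(load_pieces().map(process_piece).collect())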
a
cool thanks!