# announcements
s
Hey everyone! 👋 I have a quick question about dynamic orchestration. I have the following pipeline, which simply gathers “files from inside of files” and indexes them into a database:
```python
@pipeline
def ingestion_pipeline():
    def ingest(file):
        ingest_file(path=file)

    def spread(file):
        gather_files(path=file).map(ingest)

    gather_files().map(spread)
```
When I execute this pipeline, I get the following:
```
Solid "ingest_file" cannot be downstream of more than one dynamic output. It is downstream of both "gather_files" and "gather_files_2"
```
As I understand it, nesting dynamic outputs is not currently implemented. Alright, so instead of mapping from inside the `spread` function, I tried to map over its result, like this:
```python
@pipeline
def ingestion_pipeline():
    def ingest(file):
        ingest_file(path=file)

    def spread(file):
        return gather_files(path=file)

    spread_files = gather_files().map(spread)
    spread_files.map(ingest)
```
It does not work either, which makes sense since the underlying dependency graph should be the same. Does anyone have tips on how to overcome this kind of situation using Dagster?
a
Yeah, it's an unfortunate limitation that should get removed in the future.
> files from inside of files
Are these directories? I'm not sure exactly what this means. Either way, I think the workaround is to have the body of the `gather_files` solid (the one emitting the dynamic outputs) do the recursion into the "files that contain other files" and emit the whole flat set straight away, if that's possible.
s
They are zip archives, hence the extra step. Thanks for the answer. Do you have any idea when this will be implemented?
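For what it's worth, the recursion-inside-the-solid workaround can be sketched for this zip-in-zip case in plain Python (the dagster decorators are omitted so the flattening logic stands alone; `gather_files_recursive` and the path scheme are illustrative names, not dagster API):

```python
# Sketch of the suggested workaround: instead of nesting .map() calls,
# recurse inside the gathering step and emit one flat set of outputs.
import io
import zipfile


def gather_files_recursive(data: bytes, prefix: str = ""):
    """Yield paths of all non-zip members, descending into zip
    archives that are nested inside other zip archives."""
    with zipfile.ZipFile(io.BytesIO(data)) as zf:
        for info in zf.infolist():
            if info.is_dir():
                continue
            path = f"{prefix}/{info.filename}" if prefix else info.filename
            if info.filename.endswith(".zip"):
                # Recurse into the inner archive instead of mapping again.
                yield from gather_files_recursive(zf.read(info), path)
            else:
                yield path
```

Each yielded path would then become a single `DynamicOutput`, so `ingest_file` is only ever downstream of one dynamic-output solid.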
a
Not at this time; focused on implementing `collect` for this upcoming release.
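For context, a rough plain-Python analogy of what map-then-collect semantics look like (this is not the dagster API, just an illustration of fan-out followed by fan-in):

```python
# Illustrative analogy only: a dynamic .map() runs a step once per
# emitted output (fan-out), and collect gathers the mapped results
# back so one downstream step sees them all as a single list (fan-in).
def dynamic_map(items, step):
    return [step(item) for item in items]   # one invocation per output


def collect(mapped_results):
    return list(mapped_results)             # single downstream input


sizes = dynamic_map(["a.txt", "bb.txt"], len)
total = sum(collect(sizes))                 # a downstream "summarize" step
```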
s
Also a wonderful improvement 👍
Still trying to overcome this in a "proper" way: is there any update on nesting dynamically generated pipelines as of 0.11.6?
a
no update at this time