# ask-community
j
Hey, hello everyone! In my pipeline at the moment I’m copying a lot of assets to two different places without modifying the data, since I’m required to always store a copy in S3. Is there a way I can save the same asset to two different locations so I don’t have to duplicate all my source assets? Here’s an example of what I’m doing:
from dagster import OpExecutionContext, Output, asset

@asset(
    io_manager_key="s3_csv_io_manager",
    op_tags={"kind": "s3"},
)
def asset_to_s3(
    context: OpExecutionContext
):
    # get the asset from somewhere (source elided here)
    master_monthly_summary_report = ...
    return Output(master_monthly_summary_report)

@asset(
    io_manager_key="snowflake_io_manager",
    op_tags={"kind": "snowflake"},
)
def asset_to_snowflake(
    context: OpExecutionContext, asset_to_s3
):
    return Output(asset_to_s3)
I would like to do something like this:
@asset(
    backup_io_manager_key="s3_csv_io_manager",  # imagined parameter, not a real Dagster API
    io_manager_key="snowflake_io_manager",
    op_tags={"kind": "s3"},
)
def asset_to_s3(
    context: OpExecutionContext
):
    # get the asset from somewhere (source elided here)
    master_monthly_summary_report = ...
    return Output(master_monthly_summary_report)
And store the asset in both places at the same time. Is there a way to do that in Dagster?
b
Why not have a single io_manager that first writes the data to S3, and then writes it to Snowflake?
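For reference, a minimal sketch of what that composite IO manager could look like, assuming the two existing IO managers can simply be delegated to. The class name and constructor wiring here are illustrative, not a built-in Dagster API; handle_output and load_input are the standard dagster.IOManager hooks.

from dagster import IOManager, InputContext, OutputContext

class DualWriteIOManager(IOManager):
    """Illustrative composite IO manager: every output is written to S3
    first (the mandatory backup copy) and then to Snowflake."""

    def __init__(self, s3_manager: IOManager, snowflake_manager: IOManager):
        # Assumed to be the managers already configured as
        # "s3_csv_io_manager" and "snowflake_io_manager" in the pipeline.
        self._s3 = s3_manager
        self._snowflake = snowflake_manager

    def handle_output(self, context: OutputContext, obj) -> None:
        # One asset, two destinations: back it up to S3, then load Snowflake.
        self._s3.handle_output(context, obj)
        self._snowflake.handle_output(context, obj)

    def load_input(self, context: InputContext):
        # Downstream assets read from a single canonical store (Snowflake).
        return self._snowflake.load_input(context)

With something like that registered under a single io_manager_key, asset_to_s3 and asset_to_snowflake would collapse into one asset.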
j
That’s a pretty straightforward and simple way of doing what I want to achieve. Thank you very much; I was making this harder than it actually is.
b
Hopefully it really is that straightforward. Good luck!