# ask-community
Hi all! I have an unpartitioned asset that builds multiple pandas dataframes, and I'd like each dataframe to be written to an S3 bucket. My code looks like this, but I can't get anything working; I get the following error:
Compute for op "…" yielded a DynamicOutput, but did not use DynamicOutputDefinition.
Is there something I'm misunderstanding or forgetting here? Thanks in advance.
```python
from dagster import DynamicOutput, IOManager, InputContext, OutputContext, asset, io_manager
import pandas as pd

@asset(io_manager_key="my_io_manager")
def my_asset(context):
    # Build each dataframe and emit it as a dynamic output,
    # keyed by its name, so each one is handled separately.
    for data_name in ["data_1", "data_2", "data_3"]:
        yield DynamicOutput(build_data(data_name), mapping_key=data_name)


class MyIOManager(IOManager):
    def __init__(self):
        self.path = "…"

    def handle_output(self, context: OutputContext, obj: pd.DataFrame):
        # Write the dataframe out as parquet.
        obj.to_parquet(path=self.path)

    def load_input(self, context: InputContext):
        pass


@io_manager
def my_io_manager(init_context):
    return MyIOManager()
```
hi @Jordan, unfortunately Software-Defined Assets don't support Dynamic Outputs. The simple explanation is that if the asset is truly defined by the software, then there shouldn't be any runtime computation determining which asset(s) a given bit of code will produce.
If the list ['data_1', ...] is known ahead of time and won't change based on runtime factors, then you don't actually need Dynamic Outputs, and I'd instead model this code as a multi_asset:
```python
from dagster import Out, Output, multi_asset

@multi_asset(outs={name: Out(io_manager_key="my_io_manager") for name in ["data_1", "data_2", "data_3"]})
def my_data_assets(context):
    yield Output(value=build_data("data_1"), name="data_1")
    yield Output(value=build_data("data_2"), name="data_2")
    yield Output(value=build_data("data_3"), name="data_3")
```
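Each Output then goes through handle_output separately, so the IO manager can use context.name to write each dataframe to its own S3 key. A minimal sketch of that idea (the class name and bucket are illustrative, and writing to s3:// paths assumes s3fs is installed):
```python
import pandas as pd
from dagster import IOManager, InputContext, OutputContext

class S3ParquetIOManager(IOManager):
    def handle_output(self, context: OutputContext, obj: pd.DataFrame):
        # context.name is the output name ("data_1", "data_2", ...),
        # so each dataframe lands under its own key in the bucket.
        obj.to_parquet(f"s3://my-bucket/{context.name}.parquet")  # bucket name is hypothetical

    def load_input(self, context: InputContext):
        # Loading is left out here, as in the original snippet.
        pass
```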
If that list is not knowable when you're defining the asset (e.g. it requires calling out to some external service), then I'd recommend an op-based approach instead, where dynamic outputs are supported.
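For reference, the op-based version might look something like this (a sketch; fetch_data_names is a hypothetical stand-in for whatever runtime call determines the list):
```python
from dagster import DynamicOut, DynamicOutput, job, op

@op(out=DynamicOut())
def emit_data_names():
    # The list is only known at runtime, e.g. fetched from an external service.
    for data_name in fetch_data_names():  # hypothetical helper
        yield DynamicOutput(data_name, mapping_key=data_name)

@op
def build_and_write(data_name: str):
    df = build_data(data_name)
    df.to_parquet(f"s3://my-bucket/{data_name}.parquet")  # bucket name is hypothetical

@job
def build_all_data():
    # .map fans build_and_write out over each dynamic output.
    emit_data_names().map(build_and_write)
```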