# ask-community
hi all 🙂 I am trying to create a downstream asset from a multipartitioned asset. However, it seems to load every upstream partition for each partition of the downstream asset. The end goal is to reduce one of the partition dimensions, but both using a new partition def and reusing the upstream's partition def result in this load-everything behaviour. Here is the code, using the upstream partition definition.
`ReduceMapping` has infinite loops in both of the functions that need to be implemented, and since the code never gets stuck, they don't seem to be getting called at all. When I print the partitions being loaded in the IO manager, it prints a list of all the partitions.
```python
feats = AssetsDefinition.from_op(
    features,
    partitions_def=features_partition_def,
    keys_by_input_name={
        'model': model_asset.asset_key,
        'source_dataset': dataset_asset.keys_by_output_name['empty_dataset'],
        'hyperparameters': hyperparameters_asset.asset_key,
        'target_pivot_table': pandas_pivot_table_asset.asset_key,
    },
    partition_mappings={
        'target_pivot_table': target_mapping,
        'model': MultiPartitionsMapping('model_source'),
        'source_dataset': MultiPartitionsMapping('model_source'),
        'hyperparameters': MultiPartitionsMapping('model_source'),
    },
    group_name=ASSET_GROUP,
    metadata_by_output_name={}
)

preds = AssetsDefinition.from_op(
    predictions,
    keys_by_input_name={
        'features': feats.asset_key
    },
    partitions_def=features_partition_def,
    group_name=ASSET_GROUP,
    partition_mappings={
        'features': ReduceMapping()
    }
)
```
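
For reference, if the goal is just to collapse the multipartition down to one of its dimensions, Dagster's built-in `MultiToSingleDimensionPartitionMapping` may cover it without a custom `ReduceMapping`. A minimal sketch of that pattern, assuming hypothetical `model_source` / `date` dimensions and daily partitions (not the actual definitions from the snippet above):

```python
from dagster import (
    AssetIn,
    DailyPartitionsDefinition,
    MultiPartitionsDefinition,
    MultiToSingleDimensionPartitionMapping,
    StaticPartitionsDefinition,
    asset,
)

# Hypothetical two-dimensional partitioning: a "model_source" dimension and a "date" dimension.
features_partition_def = MultiPartitionsDefinition(
    {
        "model_source": StaticPartitionsDefinition(["a", "b"]),
        "date": DailyPartitionsDefinition(start_date="2023-01-01"),
    }
)


@asset(partitions_def=features_partition_def)
def features():
    ...


# The downstream asset keeps only the "date" dimension; the built-in mapping fans in
# every "model_source" partition that shares the downstream asset's date.
@asset(
    partitions_def=DailyPartitionsDefinition(start_date="2023-01-01"),
    ins={
        "features": AssetIn(
            partition_mapping=MultiToSingleDimensionPartitionMapping(
                partition_dimension_name="date"
            )
        )
    },
)
def predictions(features):
    ...
```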
ah, it's a bug in my io manager!
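
For anyone hitting the same thing: when a partition mapping fans several upstream partitions into one downstream partition, `load_input` sees multiple keys, so the IO manager has to branch on `context.asset_partition_keys` rather than assuming a single `context.asset_partition_key`. A rough sketch of that shape, assuming a hypothetical one-pickle-file-per-partition layout (not the actual IO manager from this thread):

```python
import os
import pickle

from dagster import InputContext, IOManager, OutputContext


class PartitionedPickleIOManager(IOManager):
    """Hypothetical IO manager that stores one pickle file per partition key."""

    def __init__(self, base_dir: str = "/tmp/assets"):
        self._base_dir = base_dir

    def _path(self, asset_key, partition_key: str) -> str:
        return os.path.join(self._base_dir, *asset_key.path, f"{partition_key}.pkl")

    def handle_output(self, context: OutputContext, obj) -> None:
        # Each materialization writes exactly one partition of the asset.
        path = self._path(context.asset_key, context.asset_partition_key)
        os.makedirs(os.path.dirname(path), exist_ok=True)
        with open(path, "wb") as f:
            pickle.dump(obj, f)

    def load_input(self, context: InputContext):
        # All upstream partitions mapped to this input by the partition mapping.
        keys = context.asset_partition_keys
        if len(keys) == 1:
            with open(self._path(context.asset_key, keys[0]), "rb") as f:
                return pickle.load(f)
        # Fan-in case: return a dict keyed by upstream partition key.
        loaded = {}
        for key in keys:
            with open(self._path(context.asset_key, key), "rb") as f:
                loaded[key] = pickle.load(f)
        return loaded
```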