# ask-community

George Eddie (UK)

07/12/2023, 1:18 PM
Hello, I am currently trying to design an architecture around Dagster to migrate an existing project, but I am struggling with a key concept. I have a DAG of asset/op definitions that I need to be able to run for every document uploaded to blob storage, with each run being "materialized" as a new individual record in our application's Elasticsearch DB. I feel like using a DynamicPartition is not the best way of doing this


07/12/2023, 4:15 PM
I think dynamic partitions + sensors are arguably one way of doing this, but it depends on the volume of these records. If it’s too high volume, you might run into perf issues. If the asset doesn’t need to run immediately, and on every single record individually, another option is to chunk based on time instead, and use time-based partitions