# ask-community
l
Hello, I have a question about partitioned assets and I haven’t been able to find a good answer: What’s the best way to deal with partitioned assets and dependent assets? Let’s assume I have a daily dump that I marked as a daily partitioned asset. I have dependent assets/ops that “just” need to re-run when a new dump is available. Can I just target the latest dump? Or do I need to make all the downstream jobs partitioned as well? Thank you!
c
It really depends on how you model it, but generally speaking a downstream daily partitioned asset should still just run when the new dump is available.
l
Thank you; I’m not sure how to achieve that. Right now I get this error, which seems to indicate that all the partitions are passed down to the dependent asset, instead of just the latest:
```
dagster._check.CheckError: Failure condition: Loading an input that corresponds to multiple partitions, but the type annotation on the op input is not a dict, Dict, Mapping, or Any: is '<class 'bytes'>'.
```
Do you know good resources about modeling with Dagster? I’m new to the product and while the documentation is good, it lacks complex examples 😅
okay, so perhaps I can leverage `dagster.LastPartitionMapping`
also, I’m not sure my upstream asset should really be partitioned in my case, or if it should just be “updated” by a sensor