Airton Neto

12/01/2022, 5:46 PM
Hi there, just following up on what we discussed in the demo meeting. The two use cases I mentioned:
1) I'm fetching daily data from the Global Forecast server since 2021-01-01. For that I use a single multi-partitioned asset with two partition keys, the second key taking 81 values. That means I'd have to materialize ~56K partitions (not 48, my mistake, haha), which I suppose I could do in batches.
2) That's not my current problem, but another common IoT situation is mining data from a bunch of devices (let's say 100) where each one has a bunch of operating signals, like temperatures, currents, voltages, etc. (50 signals). One could want to mine all this data and stream some calculation, which would yield 5K materializations.
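(For anyone following along, a quick sanity check of those counts with plain Python; the end date is an assumption based on the message date, 2022-12-01.)

```python
from datetime import date

# Use case 1: one daily partition key since 2021-01-01, crossed with a
# second key taking 81 values (e.g. 81 stations/locations).
days = (date(2022, 12, 1) - date(2021, 1, 1)).days + 1  # inclusive day count
print(days * 81)  # -> 56700, i.e. the ~56K partitions mentioned above

# Use case 2: 100 devices x 50 operating signals each.
print(100 * 50)  # -> 5000 materializations
```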
You guys suggested some ways to deal with it: the GraphQL API, keeping an eye on my cluster's resources, or using Dagster's sensors. I'll look into these.
@ben @Fraser Marlow thanks for the support!

Fraser Marlow

12/01/2022, 6:28 PM
@owen - should you want to pick up the thread!

Airton Neto

12/01/2022, 6:37 PM
In fact, that run_coordinator totally solved this issue
The queue did not receive all the tasks at once; they keep arriving in the queue and are then materialized in runs, limited to 25 concurrent materializations.
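For reference, that queueing behavior comes from Dagster's `QueuedRunCoordinator`, configured in `dagster.yaml`; a minimal sketch, assuming the 25-run cap described above is set via `max_concurrent_runs`:

```yaml
# dagster.yaml -- queue runs instead of launching them all immediately
run_coordinator:
  module: dagster.core.run_coordinator
  class: QueuedRunCoordinator
  config:
    max_concurrent_runs: 25
```

With this in place, a large backfill is enqueued in full, but the daemon dequeues and launches at most 25 runs at a time.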