# ask-community
c
For a run - is the data about how long each asset materialization took available for download or copy-as-markdown?
t
I know you just asked about the Postgres db further down, but to answer this original question: yeah! You could dump it straight out of the Postgres db, but I'd recommend using the GraphQL API to get this data out, as that'll be more resilient to changes that the Dagster team makes to our underlying databases.
❤️ 1
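As a rough sketch of what that GraphQL pull could look like: the query below asks the Dagster webserver (assumed here to be at http://localhost:3000/graphql) for per-step start/end times on a single run, which maps to per-asset materialization time when each asset runs as its own step. The runOrError / stepStats field names match recent Dagster GraphQL schemas, but it's worth confirming them in the GraphQL playground for your version. YOUR_RUN_ID is a placeholder.

```python
# Sketch: pull per-step (i.e. per-asset) timings for one run from Dagster's GraphQL API.
# Assumes a local dagster-webserver; adjust GRAPHQL_URL for your deployment.
import requests

GRAPHQL_URL = "http://localhost:3000/graphql"  # assumption: local webserver

QUERY = """
query RunStepTimings($runId: ID!) {
  runOrError(runId: $runId) {
    ... on Run {
      runId
      status
      stepStats {
        stepKey
        status
        startTime
        endTime
      }
    }
  }
}
"""


def step_durations(run_id: str) -> dict:
    """Return {step_key: duration_in_seconds} for a single run."""
    resp = requests.post(GRAPHQL_URL, json={"query": QUERY, "variables": {"runId": run_id}})
    resp.raise_for_status()
    run = resp.json()["data"]["runOrError"]
    return {
        s["stepKey"]: s["endTime"] - s["startTime"]
        for s in run["stepStats"]
        if s.get("startTime") and s.get("endTime")
    }


if __name__ == "__main__":
    # Print each step's duration, longest first.
    for step, secs in sorted(step_durations("YOUR_RUN_ID").items(), key=lambda kv: -kv[1]):
        print(f"{step}: {secs:.1f}s")
```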
c
ok, thanks! I'm trying to run an A/B test of Snowflake warehouse sizes against asset materialization times
🙂
t
I was just tinkering around with something over the weekend! I wanted to pull all my dbt models and see how their execution times fluctuated over time (threads, warehouse size, materializations, etc.), so I understand the interest and curiosity!
❤️ 1
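Extending that idea toward both the over-time view and the warehouse-size A/B comparison: a sketch that pulls recent runs with their tags and step stats, then pivots step durations by a hypothetical warehouse_size run tag (the tag name is made up here; use whatever tag you attach when launching runs). Again, the runsOrError field names follow Dagster's GraphQL schema as I understand it and should be checked against your version.

```python
# Sketch: compare step durations across recent runs, grouped by a run tag.
import pandas as pd
import requests

GRAPHQL_URL = "http://localhost:3000/graphql"  # assumption: local webserver

QUERY = """
query RecentRunTimings($limit: Int) {
  runsOrError(limit: $limit) {
    ... on Runs {
      results {
        runId
        tags { key value }
        stepStats { stepKey startTime endTime }
      }
    }
  }
}
"""


def load_timings(limit: int = 50) -> pd.DataFrame:
    resp = requests.post(GRAPHQL_URL, json={"query": QUERY, "variables": {"limit": limit}})
    resp.raise_for_status()
    rows = []
    for run in resp.json()["data"]["runsOrError"]["results"]:
        tags = {t["key"]: t["value"] for t in run["tags"]}
        for s in run["stepStats"]:
            if s.get("startTime") and s.get("endTime"):
                rows.append(
                    {
                        "run_id": run["runId"],
                        "step_key": s["stepKey"],
                        "started": pd.to_datetime(s["startTime"], unit="s"),
                        "duration_s": s["endTime"] - s["startTime"],
                        # hypothetical tag labelling the A/B arms
                        "warehouse_size": tags.get("warehouse_size", "unknown"),
                    }
                )
    return pd.DataFrame(rows)


if __name__ == "__main__":
    df = load_timings()
    # Median duration per step, per warehouse size; the "started" column can be
    # used instead to chart how a given step's duration drifts over time.
    print(
        df.pivot_table(
            index="step_key", columns="warehouse_size", values="duration_s", aggfunc="median"
        )
    )
```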
Have you heard of select.dev? Can't speak to the product, but their blog is great at helping you dig into the metadata behind your Snowflake costs. Disclaimer that I used to work with one of their founders 😅
c
I'm in uh.... "negotiations" ... with my IT department to let me provision an XL warehouse that I use for burst compute jobs overnight and which is suspended upon their completion. They seem to be working with uh.... "alternate facts" about how much this would cost. Anyhow, I'm trying to gather data. Thanks for the link! I'll take a look!
So far, I'm seeing a 3x-4x speedup on long-running jobs when they run on a Large vs. a Small warehouse.
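For the cost side of that negotiation, the back-of-the-envelope math is worth writing down: Snowflake bills credits per hour by warehouse size (Small = 2 credits/hr, Large = 8 credits/hr on the standard scale; the dollar value of a credit depends on edition and contract), so a 3x-4x speedup on a warehouse that costs 4x per hour is roughly cost-neutral. A tiny sketch with a made-up 4-hour job:

```python
# Back-of-the-envelope: does a Large warehouse cost more than a Small one
# if the same job also finishes 3x-4x faster?  Cost = credits/hour * hours.
# Standard Snowflake rates: Small = 2 credits/hr, Large = 8 credits/hr.
SMALL_CREDITS_PER_HR = 2
LARGE_CREDITS_PER_HR = 8

small_hours = 4.0                 # example: a 4-hour overnight job on Small
for speedup in (3.0, 4.0):        # the 3x-4x range observed above
    large_hours = small_hours / speedup
    small_cost = SMALL_CREDITS_PER_HR * small_hours
    large_cost = LARGE_CREDITS_PER_HR * large_hours
    print(f"{speedup:.0f}x speedup: Small = {small_cost:.1f} credits, "
          f"Large = {large_cost:.1f} credits")

# 3x: 8.0 vs 10.7 credits; 4x: 8.0 vs 8.0 credits -- roughly a wash,
# with the job finishing hours earlier.
```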
t
Aaah, yes. Those are fun. I once had to talk a client out of only using XS warehouses for everything. They said they didn't have a lot of data, but their cross joins and the disk spillage that came out of that said otherwise.
c
I'm grateful for the metadata Dagster provides in this regard, because it makes it a lot easier to compare what's happening.
daggy love 1