I'm currently migrating our Dagster instance to assets. Our simple job moves data from our database to S3 in daily batches. What's the best way to approach the migration? So far I've tried:
• Building an asset that uses S3 and the database as resources
• Adding a config schema where I pass in the query and bucket (very useful for testing!)
• A partitions definition
Now I'm using dagster.define_asset_job to create the daily refresh job (which carries the config and the partitions definition). This works quite well so far; I'm just wondering whether that's the recommended approach in general. Thanks!