Hi all,
We're seeing tons of data accumulate in both the run storage (PostgreSQL) and the asset/job storage (S3). We'd like to delete data older than, let's say, 30 days.
For the run data we can use Dagster's built-in deletion API: `context.instance.delete_run(run.dagster_run.run_id)`.
However, this doesn't seem to delete anything from the object storage. Is there a way to accomplish this other than writing a job that manually walks the buckets?
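For reference, the retention logic we have in mind looks roughly like this. This is only a sketch: the expiry check is plain stdlib, and the commented-out Dagster portion assumes `RunsFilter(created_before=...)` and `DagsterInstance.get_run_records` from the current Dagster API.

```python
from datetime import datetime, timedelta, timezone
from typing import Optional

RETENTION_DAYS = 30

def is_expired(created_at: datetime, now: Optional[datetime] = None) -> bool:
    """Return True if a run's creation time falls outside the retention window."""
    now = now or datetime.now(timezone.utc)
    return created_at < now - timedelta(days=RETENTION_DAYS)

# Inside an op, the run-storage side would look roughly like this
# (assuming RunsFilter.created_before from the Dagster API):
#
#   from dagster import RunsFilter
#   cutoff = datetime.now(timezone.utc) - timedelta(days=RETENTION_DAYS)
#   old_runs = context.instance.get_run_records(
#       filters=RunsFilter(created_before=cutoff), limit=100
#   )
#   for record in old_runs:
#       context.instance.delete_run(record.dagster_run.run_id)
#
# ...but this only clears the PostgreSQL run storage, not the S3 objects.
```

The S3 side is the part we'd like to avoid hand-rolling; we're wondering whether something like an S3 lifecycle rule on the bucket prefix is the intended approach, or whether Dagster offers its own hook.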