# ask-community
y
Hi all, we're seeing tons of data in both the run storage (PostgreSQL) and the asset/job storage (S3). We'd like to delete old data past, let's say, 30 days. For the runs' data we can use Dagster's built-in API for deletion, `context.instance.delete_run(run.dagster_run.run_id)`, but this doesn't seem to delete anything from object storage. Is there any way to accomplish this other than creating a job that manually goes through the buckets?
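For context, a minimal sketch of what that run-deletion loop could look like, assuming a recent Dagster version where `RunsFilter` supports `created_before`; the op/job names and the batch size are illustrative, not from the thread:

```python
from datetime import datetime, timedelta, timezone

from dagster import OpExecutionContext, RunsFilter, job, op


@op
def delete_old_runs(context: OpExecutionContext):
    # runs created more than 30 days ago
    cutoff = datetime.now(timezone.utc) - timedelta(days=30)
    records = context.instance.get_run_records(
        filters=RunsFilter(created_before=cutoff),
        limit=500,  # bound each pass; rerun until nothing is left
    )
    for record in records:
        # removes the run (and its event log) from run storage,
        # but leaves any IO-manager outputs in S3 untouched
        context.instance.delete_run(record.dagster_run.run_id)
    context.log.info(f"deleted {len(records)} runs created before {cutoff}")


@job
def delete_old_runs_job():
    delete_old_runs()
```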
prha
Are the objects in S3 from use of an IO manager? We don’t have data retention APIs or TTLs for persisted output from jobs. Your best bet might be to run a job on a schedule that prunes things from your bucket (as you identified).
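A minimal sketch of such a scheduled pruning job, using boto3 directly; `my-dagster-bucket` and the `storage/` prefix are placeholders and should be matched to wherever your IO manager is configured to write:

```python
from datetime import datetime, timedelta, timezone

import boto3
from dagster import ScheduleDefinition, job, op


@op
def prune_old_s3_objects(context):
    # objects last modified more than 30 days ago
    cutoff = datetime.now(timezone.utc) - timedelta(days=30)
    s3 = boto3.client("s3")
    paginator = s3.get_paginator("list_objects_v2")
    stale_keys = []
    for page in paginator.paginate(Bucket="my-dagster-bucket", Prefix="storage/"):
        for obj in page.get("Contents", []):
            if obj["LastModified"] < cutoff:
                stale_keys.append({"Key": obj["Key"]})
    # delete_objects accepts at most 1000 keys per call
    for i in range(0, len(stale_keys), 1000):
        s3.delete_objects(
            Bucket="my-dagster-bucket",
            Delete={"Objects": stale_keys[i : i + 1000]},
        )
    context.log.info(f"deleted {len(stale_keys)} objects older than {cutoff}")


@job
def prune_s3_job():
    prune_old_s3_objects()


# run the prune once a day at midnight
prune_s3_schedule = ScheduleDefinition(job=prune_s3_job, cron_schedule="0 0 * * *")
```

Scoping the prune to the IO manager's prefix keeps it from touching unrelated objects in the same bucket.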
y
Thanks for the response @prha; I figured.