We want to store newsletter data in BigQuery. Because the statistics keep changing after a newsletter is sent (open rate, click rate), we need to update already-exported rows retroactively. We currently define the data as a partitioned asset in Dagster and export it to BigQuery via the BigQueryPandasIOManager. Backfilling the data works fine from the UI or from the CLI (dagster job backfill).
However, we have not managed to schedule this refresh. build_schedule_from_partitioned_job only materializes the most recent partition, and we got nowhere with FreshnessPolicy or auto-materialization either. Is there a way to schedule a backfill of a partitioned asset, so that older partitions are re-materialized regularly? Or would we have to resort to triggering the CLI backfill from a scheduled job?