Daniel Kilcoyne
08/17/2023, 8:38 PM
start=a|start_date, end=c|end_date
It doesn't look like there's native support, but I'd appreciate hearing if anyone has found a quick workaround. I'd like to keep the asset multi-partitioned so I can trigger backfills on specific static partitions, and I don't want to trigger a separate job run for each partition because I'm running on Spark. Example code below:
from dagster import (
    AssetExecutionContext,
    DailyPartitionsDefinition,
    MultiPartitionsDefinition,
    StaticPartitionsDefinition,
    asset,
    build_schedule_from_partitioned_job,
    define_asset_job,
)

TEST_DATE = "2023-01-01"  # placeholder; defined elsewhere in my code

daily_multi_partitions_definition = MultiPartitionsDefinition(
    {
        "static": StaticPartitionsDefinition(["a", "b", "c"]),
        "date": DailyPartitionsDefinition(start_date=TEST_DATE),
    }
)

@asset(partitions_def=daily_multi_partitions_definition)
def daily_multi_partitions_asset(context: AssetExecutionContext) -> int:
    return 3

daily_multi_partitions_job = define_asset_job(
    name="daily_multi_partitions_job",
    selection=daily_multi_partitions_asset.key.path,
    partitions_def=daily_multi_partitions_definition,
)

daily_multi_partitions_job_schedule = build_schedule_from_partitioned_job(
    job=daily_multi_partitions_job
)
sean
08/18/2023, 3:01 PM
I'd recommend skipping build_schedule_from_partitioned_job and using the lower-level @schedule API:
date_partitions = daily_multi_partitions_definition.partitions_defs[1].partitions_def

@schedule(
    cron_schedule=date_partitions.get_cron_schedule(),
    job=daily_multi_partitions_job,
)
def daily_multi_partitions_job_schedule(context):
    ...  # return desired run request
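The "return desired run request" step is left open above. One way to fill it in (my reading, not sean's stated answer): since Daniel wants a single Spark run rather than one run per partition, the schedule tick could return a RunRequest carrying Dagster's single-run backfill range tags, matching the start=a|start_date, end=c|end_date shape from the original question. A minimal sketch of just the tag construction, with no Dagster dependency; the tag names and the "static|date" key ordering are assumptions to verify against your Dagster version:

```python
# Hypothetical helper (not from the thread): build the run tags that ask
# Dagster to materialize a partition range in a single run, using the
# "static|date" multi-partition key shape from Daniel's example.

# Tag constants believed to match dagster's single-run backfill tags;
# verify against your installed version.
ASSET_PARTITION_RANGE_START_TAG = "dagster/asset_partition_range_start"
ASSET_PARTITION_RANGE_END_TAG = "dagster/asset_partition_range_end"

def range_tags_for_date(date_key: str) -> dict[str, str]:
    # One run spanning static partitions "a" through "c" for the given
    # date, matching the start=a|..., end=c|... shape from the question.
    return {
        ASSET_PARTITION_RANGE_START_TAG: f"a|{date_key}",
        ASSET_PARTITION_RANGE_END_TAG: f"c|{date_key}",
    }
```

Inside the @schedule body this dict would be passed as RunRequest(tags=range_tags_for_date(date_key)); whether a single run actually honors the range depends on how the asset's compute handles partition ranges, so treat this as a starting point rather than a confirmed solution.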
Daniel Kilcoyne
08/18/2023, 3:13 PM