I'm trying to use a daily partitioned source asset...
# ask-community
r
I'm trying to use a daily partitioned source asset as an input to an op as part of a job. The source asset is using the `snowflake_pandas_io_manager`. I have the `partitions_def` set to a `DailyPartitionsDefinition` for both the `@job` and the `SourceAsset`. In dagit, when running the job with a particular partition, such as `2023-03-30`, I'm seeing it query with `WHERE last_updated_at >= '2023-03-01 00:00:00' AND last_updated_at < '2023-04-03 00:00:00'`, when I'd expect it to be for just the single day (`2023-03-01` is the `start_date` of my partition). Do I need to add another `partitions_def`, or am I doing something wrong? Also, I don't see a link between the `@op`/`@job` and the `SourceAsset` in dagit, even though I am seeing data come in from the table in Snowflake.
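For reference, a minimal sketch of the setup being described, assuming Dagster 1.x APIs where a `SourceAsset` can be passed directly to an op inside a `@job` body; the asset key, column, and op names are hypothetical, and resource wiring is omitted:

```python
import pandas as pd
from dagster import DailyPartitionsDefinition, SourceAsset, job, op

daily_partitions = DailyPartitionsDefinition(start_date="2023-03-01")

# Source asset backed by a Snowflake table; `partition_expr` tells the
# Snowflake IO manager which column to filter on when loading a partition.
events = SourceAsset(
    key="events",
    partitions_def=daily_partitions,
    io_manager_key="io_manager",
    metadata={"partition_expr": "last_updated_at"},
)


@op
def process_events(events: pd.DataFrame) -> None:
    # `events` arrives as a DataFrame loaded by snowflake_pandas_io_manager.
    ...


# Passing the SourceAsset to the op inside the job body wires it up as an
# input that the IO manager loads at runtime.
@job(partitions_def=daily_partitions)
def process_events_job():
    process_events(events)
```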
d
what `partitions_mapping` are you using?
r
I don't have a `partitions_mapping` defined; where is this needed?
d
oh, maybe that’s just for asset -> asset dependencies. how are you triggering your job?
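For context on the exchange above: partition mappings apply per input on asset-to-asset dependencies, where the argument is spelled `partition_mapping`. A minimal sketch with hypothetical asset names, assuming `TimeWindowPartitionMapping` as the identity daily mapping:

```python
from dagster import (
    AssetIn,
    DailyPartitionsDefinition,
    TimeWindowPartitionMapping,
    asset,
)

daily_partitions = DailyPartitionsDefinition(start_date="2023-03-01")


@asset(partitions_def=daily_partitions)
def raw_events() -> None:
    ...


# For asset -> asset dependencies, the default TimeWindowPartitionMapping
# maps each downstream daily partition to the matching upstream day.
@asset(
    partitions_def=daily_partitions,
    ins={"raw_events": AssetIn(partition_mapping=TimeWindowPartitionMapping())},
)
def daily_events(raw_events) -> None:
    ...
```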
r
I'm using the `Launch backfill` button in the job's `Partitions` tab and selecting the partition.
s
Hi Ryan - this is a very reasonable request, but there currently isn't an easy way to make this work. The issue, as @Drew You pointed out, is that there isn't a mapping between the partitions targeted by the job and the partitions in the asset that the job is reading from, so we read every partition from the asset. If you'd be up for filing an issue on GitHub, we can try to find time to look into providing a way to set this mapping.
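Until such a mapping exists, one possible workaround (a sketch, not from the thread) is to trim the over-fetched input inside the op using the run's partition time window, assuming `last_updated_at` is the partitioning column:

```python
import pandas as pd
from dagster import OpExecutionContext, op


@op
def process_events(context: OpExecutionContext, events: pd.DataFrame) -> None:
    # All partitions of the source asset are loaded, so filter the frame
    # down to this run's daily window. Assumes `last_updated_at` is a
    # datetime column compatible with the partition definition's timezone.
    window = context.partition_time_window
    events = events[
        (events["last_updated_at"] >= window.start)
        & (events["last_updated_at"] < window.end)
    ]
    ...
```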
r
🙏