# integration-airflow
m
hello, how can I create a Partition after I did this?
j
hey @Mirza partitions only exist for Software Defined Assets (https://docs.dagster.io/concepts/assets/software-defined-assets), so `make_schedules_and_jobs_from_airflow_dag_bag` won't create partitions, but the schedules will have runs that you can re-execute/backfill, similar to partitions
if you want to use SDAs and partitions, `load_assets_from_airflow_dag_bag` will work: https://docs.dagster.io/_apidocs/libraries/dagster-airflow#dagster_airflow.load_assets_from_airflow_dag
airflow dags/schedules will map more cleanly onto dagster jobs/schedules, but if you want to take advantage of SDAs (and partitions), that's the way to do it
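A minimal sketch of that asset-based route, using `load_assets_from_airflow_dag` (the corrected name given further down in the thread); the dags folder and DAG id are placeholders, and the exact keyword arguments are worth checking against the API docs linked above:

```python
from airflow.models import DagBag
from dagster import Definitions
from dagster_airflow import load_assets_from_airflow_dag

# Placeholders: point these at your own dags folder and DAG id.
dag_bag = DagBag(dag_folder="path/to/dags")
dag = dag_bag.get_dag("staging_data_transfer")

# Each Airflow task in the DAG is loaded as (part of) a software-defined asset.
airflow_assets = load_assets_from_airflow_dag(dag)

defs = Definitions(assets=airflow_assets)
```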
m
https://dagster.slack.com/archives/CH2SCAV19/p1686787656021319?thread_ts=1686786930.256189&cid=CH2SCAV19 I see, if I still want to use `make_schedules_and_jobs_from_airflow_dag_bag`, could I just add more schedules to the generated `staging_data_transfer_schedules` then?
j
yes, you could do that; only the schedules you pass to the `Definitions` class or return from the `@repository` decorator will be used
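A sketch of what combining the generated schedules with an extra hand-written one could look like, assuming `make_schedules_and_jobs_from_airflow_dag_bag` returns the schedules and jobs as two lists; the cron string, schedule name, and dags folder are placeholders, and as the follow-up below notes, whether a second schedule can target the same generated job is worth verifying:

```python
from airflow.models import DagBag
from dagster import Definitions, ScheduleDefinition
from dagster_airflow import make_schedules_and_jobs_from_airflow_dag_bag

dag_bag = DagBag(dag_folder="path/to/dags")

# Generated from the Airflow DAGs; names mirror the ones used in this thread.
staging_data_transfer_schedules, staging_data_transfer_jobs = (
    make_schedules_and_jobs_from_airflow_dag_bag(dag_bag)
)

# Hand-written extra schedule for one of the generated jobs (placeholder cron).
extra_schedule = ScheduleDefinition(
    name="staging_data_transfer_extra_schedule",
    cron_schedule="0 6 * * *",
    job=staging_data_transfer_jobs[0],
)

defs = Definitions(
    jobs=staging_data_transfer_jobs,
    schedules=[*staging_data_transfer_schedules, extra_schedule],
)
```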
m
seems like we can only have 1 schedule per job
how about putting the partitioned_config in `staging_data_transfer_jobs`, like this:
```python
for job in staging_data_transfer_tracking_events_jobs:
    job.partitioned_config = PartitionedConfig(...)
```
is it possible?
sorry if it's too forced, I have tried `load_assets_from_airflow_dag_bag` but it returns a module-not-found error in our dagster deployment
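For context on the `partitioned_config` question above: in plain Dagster, a `PartitionedConfig` is normally attached when the job is defined rather than assigned onto an existing job object, and the thread doesn't confirm whether mutating the generated jobs works. A minimal sketch of the usual pattern, with placeholder names:

```python
from datetime import datetime

from dagster import OpExecutionContext, daily_partitioned_config, job, op

@daily_partitioned_config(start_date=datetime(2023, 1, 1))
def staging_partitioned_config(start: datetime, _end: datetime):
    # Feed the partition's date into the op's run config.
    return {"ops": {"transfer_data": {"config": {"date": start.strftime("%Y-%m-%d")}}}}

@op(config_schema={"date": str})
def transfer_data(context: OpExecutionContext):
    context.log.info(f"transferring data for {context.op_config['date']}")

# The partitioned config is passed at job-definition time.
@job(config=staging_partitioned_config)
def staging_data_transfer_partitioned_job():
    transfer_data()
```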
j
ah sorry, it's `load_assets_from_airflow_dag`
I might also be missing what you're trying to accomplish; why do you want to use partitions for these jobs?
m
sorry for the late reply. I just want to do a backfill; checking the Dagster docs, it looks like I have to create a partition first. We have the task script in a Docker image, so I'm trying to use KubernetesPodOperator in Airflow. Do you have any better idea for backfilling it?
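Not settled in the thread, but for reference, the general Dagster pattern for backfills is to give the asset (or job) a partitions definition and then launch a backfill over a range of partitions from the Dagster UI. A minimal sketch with placeholder names; how this would plug into the Airflow/KubernetesPodOperator setup above is left open:

```python
from dagster import DailyPartitionsDefinition, Definitions, asset, define_asset_job

daily_partitions = DailyPartitionsDefinition(start_date="2023-01-01")

@asset(partitions_def=daily_partitions)
def staging_table(context):
    # context.partition_key is the date being materialized or backfilled.
    context.log.info(f"materializing partition {context.partition_key}")

# A job over the partitioned asset; each partition becomes a backfillable run.
staging_backfill_job = define_asset_job(
    "staging_backfill_job",
    selection=[staging_table],
    partitions_def=daily_partitions,
)

defs = Definitions(assets=[staging_table], jobs=[staging_backfill_job])
```

Once this is deployed, a backfill over any date range can be launched from the asset or job page in the Dagster UI.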