# integration-airflow

Mirza

06/14/2023, 11:55 PM
hello, how can I create a Partition after I did this?

Joe

06/15/2023, 12:07 AM
hey @Mirza, partitions only exist for Software-Defined Assets (https://docs.dagster.io/concepts/assets/software-defined-assets). The
make_schedules_and_jobs_from_airflow_dag_bag
function won't create partitions, but the schedules it generates will have runs that you can re-execute or backfill, similar to partitions.
If you want to use SDAs and partitions,
load_assets_from_airflow_dag_bag
will work: https://docs.dagster.io/_apidocs/libraries/dagster-airflow#dagster_airflow.load_assets_from_airflow_dag
Airflow DAGs/schedules will map more cleanly onto Dagster jobs/schedules, but if you want to take advantage of SDAs (and partitions), that's the way to do it.
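(For context, a minimal sketch of the jobs/schedules approach described above, assuming a DagBag loaded from a placeholder dags folder; the path and variable names here are made up:)

from airflow.models import DagBag
from dagster import Definitions
from dagster_airflow import make_schedules_and_jobs_from_airflow_dag_bag

# Load the Airflow DAGs (placeholder path).
dag_bag = DagBag(dag_folder="path/to/dags")

# One Dagster job + schedule per Airflow DAG; no partitions are created.
schedules, jobs = make_schedules_and_jobs_from_airflow_dag_bag(dag_bag)

defs = Definitions(jobs=jobs, schedules=schedules)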

Mirza

06/15/2023, 1:25 AM
https://dagster.slack.com/archives/CH2SCAV19/p1686787656021319?thread_ts=1686786930.256189&cid=CH2SCAV19 I see. If I still want to use
make_schedules_and_jobs_from_airflow_dag_bag
could I just add more schedules to the
staging_data_transfer_schedules
it generates, then?

Joe

06/15/2023, 1:57 AM
yes, you could do that. Only the schedules you pass to the
Definitions
object or
@repository
decorator will be used
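(Continuing the earlier sketch, this is roughly what adding an extra schedule alongside the generated ones could look like; the schedule name and cron string are hypothetical:)

from dagster import Definitions, ScheduleDefinition
from dagster_airflow import make_schedules_and_jobs_from_airflow_dag_bag

schedules, jobs = make_schedules_and_jobs_from_airflow_dag_bag(dag_bag)

# Hypothetical extra schedule targeting one of the generated jobs.
extra_schedule = ScheduleDefinition(
    name="staging_data_transfer_extra_schedule",
    cron_schedule="0 6 * * *",
    job=jobs[0],
)

# Only schedules passed to Definitions (or @repository) are registered.
defs = Definitions(jobs=jobs, schedules=[*schedules, extra_schedule])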

Mirza

06/15/2023, 9:46 AM
seems like we can only have 1 schedule per job ^^
how about putting the partitioned_config in
staging_data_transfer_jobs
like this
for job in staging_data_transfer_tracking_events_jobs:
    job.partitioned_config = PartitionedConfig( ..
is it possible?
sorry if it's too forced. I have tried
load_assets_from_airflow_dag_bag
but it returns a module-not-found error in our Dagster deployment

Joe

06/15/2023, 12:03 PM
ah sorry, it's
load_assets_from_airflow_dag
I might also be missing what you're trying to accomplish. Why do you want to use partitions for these jobs?
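(For context, a minimal sketch of calling that function, assuming a placeholder DAG id; per the thread, the resulting software-defined assets are what partitions and backfills attach to:)

from airflow.models import DagBag
from dagster import Definitions
from dagster_airflow import load_assets_from_airflow_dag

dag_bag = DagBag(dag_folder="path/to/dags")  # placeholder path

# Load a single Airflow DAG as software-defined assets.
assets = load_assets_from_airflow_dag(dag_bag.get_dag("staging_data_transfer"))

defs = Definitions(assets=assets)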

Mirza

06/16/2023, 9:51 AM
sorry for the late reply. I just want to do a backfill; checking the Dagster docs, it seems I have to create partitions first. We have the task script in a Docker image, so I'm trying KubernetesPodOperator in Airflow. Do you have any better idea for backfilling it?