# integration-airflow
I am currently reviewing orchestration tools to replace my managed Airflow instance. I like the core concept behind Dagster and the fact that I can probably run it with less infrastructure cost (still unknown). I am concerned about a few things I was hoping this channel could help me with:
1. Using the dagster-airflow conversion, will "dynamic DAGs" from Airflow convert cleanly?
2. How would this "look" in Dagster? I have a single Airflow Python file that dynamically creates an Airflow DAG for each customer (>100), so that each customer gets a data pipeline with associated landing tables in a separate BigQuery project.
3. Is this possible in Dagster without manually creating (copy-pasting) individual assets/IO/jobs for each customer?
Forgive me, but I have only been looking at the Dagster docs trying to understand the core concepts and capabilities at this point. Thanks.
dagster bot responded by community 1
Hi @Ron Scott! Regarding topic #3, if you are doing a rewrite of the Airflow DAGs as Dagster objects, you can avoid a lot of copy-paste by using the factory pattern, e.g.:
```python
from dagster import asset

def my_asset_factory(name, some_specific_property):
    @asset(name=name)
    def my_asset(context):
        # use "some_specific_property" here to customize the
        # asset depending on the client it is associated with
        ...
    return my_asset

# Somewhere else in the code:
my_assets_by_client_key = {
    client_key: my_asset_factory(client_key, some_specific_property)
    for client_key, some_specific_property
    in some_specific_property_by_client_key.items()
}
```
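Stripped of the Dagster decorator, the factory pattern above is just closures: each call to the factory returns a new function whose behavior is bound to per-client configuration. A minimal plain-Python sketch, where all names (`make_pipeline`, `project_by_client`, the sample clients) are illustrative, not from your setup:

```python
# Sketch of the factory pattern: one pipeline function per client,
# each closed over that client's BigQuery project name.
def make_pipeline(client_key, bq_project):
    def run_pipeline():
        # A real pipeline would load into the client's BigQuery
        # project; here we only return a description of the work.
        return f"load {client_key} into {bq_project}.landing"
    return run_pipeline

# Hypothetical per-client configuration (yours would have >100 entries).
project_by_client = {
    "acme": "acme-data-prod",
    "globex": "globex-data-prod",
}

pipelines = {
    client: make_pipeline(client, project)
    for client, project in project_by_client.items()
}

print(pipelines["acme"]())  # load acme into acme-data-prod.landing
```

Because each closure captures its own arguments, nothing is copy-pasted: adding a customer means adding one entry to the config dict.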
For question #2, there are two options: 1) if you'd like to keep the dynamic fan-out within a single run, you can model it as a dynamic graph; 2) if you'd like better observability and separate runs, you can model it with dynamic asset partitions, where each customer's work becomes a separate partition.
For question #1, I'm not sure I totally follow. Do you mind elaborating?