# ask-community

Jake Kagan

03/15/2023, 2:07 PM
Is there a better alternative to how my workflow is structured? It seems like I may be adding too much overhead, but I couldn't find any guidelines. For example, here is my top-level `__init__.py`:
```python
from dagster import Definitions

from .jobs import assets, jobs, schedules

# note: the Definitions object shouldn't reuse the name `jobs` --
# that shadows the imported list; `defs` is the conventional name
defs = Definitions(
    assets=[*assets],
    jobs=[*jobs],
    schedules=[*schedules],
)
```
here is my `jobs/__init__.py`:
```python
from .beginning_inventory.retention_beginning_inventory__defs import \
    assets as retention_beginning_inventory__defs__assets

from .cantax_usage_from_bigq.cantax_usage_from_bigq__defs import \
    jobs as cantax_usage_from_bigq__defs__jobs, \
    schedules as cantax_usage_from_bigq__defs__schd


jobs = [
    *cantax_usage_from_bigq__defs__jobs,
]
assets = [
    *retention_beginning_inventory__defs__assets,
]
schedules = [
    *cantax_usage_from_bigq__defs__schd,
]
```
From there, I need to have a `__defs.py` which lists out all the definitions for a workflow:
```python
from . import retention_beginning_inventory_load as load

# -------------------------------------------- DEFINITIONS -------------------------------------------- #
assets = [
    load.beg_inv_excel_to_s3,
    load.beg_inv_s3_to_rs,
]
```
On top of all this, I needed to make a naming decorator in order to have Dagit show the location of the job file in a clearer way:
```python
@job(resource_defs={**RESOURCE_DEFS, "io_manager": mem_io_manager},
     executor_def=in_process_executor)
@name_job(str(__file__), 'incremental')
```
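The `name_job` decorator itself isn't shown in the thread. A plausible pure-Python sketch (the `<file stem>__<suffix>` naming scheme is an assumption): since decorators apply bottom-up, `name_job` can rewrite the function's `__name__` before `@job`, which defaults the job name to the function name, ever sees it.

```python
import os


def name_job(file_path, suffix):
    """Hypothetical sketch of the naming decorator: rename the decorated
    function to '<file stem>__<suffix>' so the job name Dagit displays
    reflects the source file the job lives in."""
    stem = os.path.splitext(os.path.basename(file_path))[0]

    def decorator(fn):
        fn.__name__ = f"{stem}__{suffix}"
        return fn

    return decorator
```

An alternative with less machinery is passing `name=...` directly to `@job`, at the cost of repeating the file name by hand in each job definition.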