Oliver (06/23/2022, 11:23 PM):

Zach (06/23/2022, 11:43 PM):
load_assets_from_modules?

Oliver (06/23/2022, 11:47 PM):
load_from:
  - python_package: tmp
and tmp/__init__.py has an @repository definition.
Some of my assets are defined as @asset, some from AssetsDefinition.from_graph. ..now that you mention it, I think it's the AssetsDefinition.from_graph ones that are having this issue
Oliver (06/23/2022, 11:47 PM):

Oliver (06/24/2022, 12:02 AM):
AssetsDefinition.from_graph

Oliver (06/24/2022, 12:20 AM):
asset.configured
) but that's not ideal

sandy (06/24/2022, 12:21 AM):
define_asset_job? or manually specifying config in the launchpad at the time you launch the job?

Oliver (06/24/2022, 1:33 AM):
build_asset_selection_job
dev_asset_job = build_asset_selection_job(
    "dev",
    assets=resourced_assets,
    source_assets=[],
    config=yaml.safe_load(Path("tmp/dev.yaml").read_text()),
    resource_defs=dev_resource_defs,
)
sandy (06/24/2022, 3:20 PM):
@repository
def dev_repo():
    resource_config = yaml.safe_load(Path("tmp/dev.yaml").read_text())
    return [
        *with_resources(assets, dev_resource_defs, resource_config),
        define_asset_job("dev", selection=AssetSelection.assets(*assets)),
    ]
By the way, any particular reason to use build_asset_selection_job instead of define_asset_job? The former is an internal API that could change in a future release.
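For context, the tmp/dev.yaml file both snippets read would hold resource config keyed by resource key, which is the shape with_resources expects for its resource-config argument. A hypothetical sketch (the resource names and values below are illustrative, not taken from the thread):

```yaml
# Hypothetical tmp/dev.yaml — resource config keyed by resource key.
# Resource names and values are made up for illustration.
io_manager:
  config:
    base_path: /tmp/dagster-dev
warehouse:
  config:
    connection_url: postgresql://localhost:5432/dev
```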
Oliver (06/26/2022, 2:27 AM):
build_asset_selection_job allows you to specify the executor while define_asset_job does not, which is why I'm using it. I could switch that to be defined on the repo, but it's a little awkward for the workflow I'm developing.
I have one job that uses the local multiprocessing executor for quick dev and testing. Then I have another job that uses a custom executor to run pipelines on infra in a cloud environment to test on larger-scale datasets.

Oliver (06/26/2022, 2:30 AM):
the assets defined with @asset run fine even without supplying config

sandy (06/27/2022, 3:46 PM):
> I have one job that uses local multiprocessing executor for quick dev and testing. Then I have another job that uses a custom executor to run pipelines on infra in a cloud environment to test on more large scale datasets
Got it - do you want these jobs to show up in Dagit at the same time? Our typical recommendation is to put these jobs in different dagster repos, so that you can target one with your production setup and another with your local dev setup.
Oliver (06/29/2022, 2:01 AM):