Oliver06/23/2022, 11:23 PM
Zach06/23/2022, 11:43 PM
Oliver06/23/2022, 11:47 PM
load_from:
  - python_package: tmp
definition. some of my assets are defined as
Now that you mention it, I think it's the
that are having this issue
) but that's not ideal
sandy06/24/2022, 12:21 AM
? or manually specifying config in the launchpad at the time you launch the job?
Oliver06/24/2022, 1:33 AM
dev_asset_job = build_asset_selection_job(
    "dev",
    assets=resourced_assets,
    source_assets=,
    config=yaml.safe_load(Path("tmp/dev.yaml").read_text()),
    resource_defs=dev_resource_defs,
)
sandy06/24/2022, 3:20 PM
By the way, any particular reason to use
@repository
def dev_repo():
    resource_config = yaml.safe_load(Path("tmp/dev.yaml").read_text())
    return [
        with_resources(assets, dev_resource_defs, resource_config),
        define_asset_job("dev", selection=AssetSelection.assets(assets)),
    ]
? The former is an internal API that could change in a future release.
Oliver06/26/2022, 2:27 AM
allows you to specify the executor while
does not, which is why I'm using it. I could switch that to be defined on the repo, but it's a little awkward for the workflow I'm developing. I have one job that uses the local multiprocessing executor for quick dev and testing. Then I have another job that uses a custom executor to run pipelines on infra in a cloud environment to test on larger-scale datasets
run fine even without supplying config
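(For reference: with the job API, executor settings can also be supplied as run config at launch time rather than baked into the job. A minimal sketch for the multiprocess executor, assuming the standard job-API config schema; the `max_concurrent` value is illustrative:)

```yaml
# Run config sketch for a job using the multiprocess executor.
# Key layout assumes the standard job-API schema; 4 is an illustrative value.
execution:
  config:
    multiprocess:
      max_concurrent: 4
```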
sandy06/27/2022, 3:46 PM
> I have one job that uses local multiprocessing executor for quick dev and testing. Then I have another job that uses a custom executor to run pipelines on infra in a cloud environment to test on more large scale datasets

Got it - do you want these jobs to show up in Dagit at the same time? Our typical recommendation is to put these jobs in different dagster repos, so that you can target one with your production setup and another with your local dev setup.
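(For reference: the separate-repos setup sandy describes can be wired up in `workspace.yaml` by loading each repository as a distinct entry. A sketch, where `my_project`, `dev_repo`, and `prod_repo` are hypothetical names:)

```yaml
# workspace.yaml sketch: load dev and prod repositories separately.
# Package and repository names below are hypothetical.
load_from:
  - python_package:
      package_name: my_project
      attribute: dev_repo
  - python_package:
      package_name: my_project
      attribute: prod_repo
```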
Oliver06/29/2022, 2:01 AM