# ask-community
p
Trying to set the local_artifact_storage.config.base_dir and compute_logs.config.base_dir parameters at run time. Something like:
Copy code
run_config = {
    'local_artifact_storage': {
        'config': {
            'base_dir': r'C:\Users\peter\dagster'
        }
    }
}
How can I add this to an @repository?
Copy code
from dagster import repository

@repository
def my_repo():
    return [test_asset1, test_asset2]
q
Can this be configured in dagster.yaml?
p
Yes it can, but this requires the yaml to be in the project folder, or to be pointed to using dagit -w <path_to_yaml>/dagster.yaml. Typically all my config files are in subfolders of the user's $HOME folder -> this allows a platform-agnostic install (Windows & Unix) and keeps config files out of the git repo. It also requires the path to be known when Dagster is launched. What I would like is a dynamic path, e.g. via a get_config() function.
o
for local artifact storage, assuming that your goal is to control where the pickled asset outputs are stored, you can configure the default io manager with:
Copy code
run_config = {"resources": {"io_manager": {"config": {"base_dir": "<some_base_dir>"}}}}
or
Copy code
from dagster import fs_io_manager, repository, with_resources

@repository
def my_repo():
    return with_resources(
        [test_asset1, test_asset2],
        resource_defs={"io_manager": fs_io_manager.configured({"base_dir": "some_base_dir"})},
    )
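For reference, a minimal sketch of running the first (run_config) variant outside dagit with materialize — the base_dir value here is just an example, and test_asset1/test_asset2 stand in for the assets above:
Copy code
from dagster import materialize

# Sketch only: materialize the assets with the default io_manager pointed at an example base_dir.
result = materialize(
    [test_asset1, test_asset2],
    run_config={"resources": {"io_manager": {"config": {"base_dir": "/tmp/dagster_outputs"}}}},
)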
p
Awesome
exactly the syntax I was struggling with
o
for compute logs, unfortunately it is only possible to configure their location at the instance level
p
the context config stuff is a concept I'm still learning tbh
yeah, this is a good start though - it means within the team we can share a common file system (even if run logs and instantiation aren't fully shared)
o
one more note (not sure if relevant), but if the user has a DAGSTER_HOME environment variable set (pointing at a directory w/ a dagster.yaml file in it), then you don't need to pass that path into dagster-daemon or dagit invocations
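For reference, a minimal sketch of that workflow, assuming a per-user ~/.dagster directory that already contains a dagster.yaml (the path and the subprocess launch are illustrative):
Copy code
import os
import subprocess

# Point Dagster at a per-user config directory that contains dagster.yaml,
# then launch dagit; no explicit path needs to be passed on the command line.
os.environ["DAGSTER_HOME"] = os.path.expanduser("~/.dagster")
subprocess.run(["dagit"], check=True)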
p
yep I saw that. So we could all have a dagster.yaml with our file paths, then just set DAGSTER_HOME just before typing dagit
gotta say, dagster seems way simpler than flyte
o
glad to hear it! 😄
p
just gotta get a working PoC and then convince the team to adopt it (we need a framework)
o
let me know if you run into any roadblocks 🙂
p
Making progress 🙂 I have something like this -> basically I want the base_dir that is passed to assets to be a calculated value. The job here works, and a typical use would be to open the launchpad in dagit, type in an id, which then returns a path_to_folder as job output. How do you configure something similar for the assets? The code below doesn't work for the assets, only the get_fldr job. So: launch dagit, type in an id, then materialize assets & resources which use the id as a config -> or rather, the output from the job as an input.
Copy code
from dagster import load_assets_from_package_module, repository, with_resources, op, job
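# `assets` (the package passed to load_assets_from_package_module below) and
# `local_pkl_io_manager` (a custom IO manager) are assumed to be defined elsewhere in the project.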

@op(config_schema={'param_set_id': int})
def get_ws_root_from_param_set_id(context) -> str:
    folders = {1: '/fldr1', 2: '/fldr2'}
    return folders.get(context.op_config['param_set_id'])

@job
def get_fldr() -> str:
    get_ws_root_from_param_set_id()


@repository
def results_set():
    return [
        *with_resources(
            load_assets_from_package_module(assets),
            resource_defs={
                # Passing the op itself as the base_dir value is the part that doesn't work.
                "io_manager": local_pkl_io_manager.configured(
                    {"base_dir": get_ws_root_from_param_set_id}
                ),
            },
        ),
        get_fldr
    ]
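For reference, a minimal sketch of how the working get_fldr job above can be run with a param_set_id (the equivalent of typing the id into the dagit launchpad); the id value is just an example:
Copy code
# Sketch only: runs the get_fldr job from the snippet above with an example id.
result = get_fldr.execute_in_process(
    run_config={"ops": {"get_ws_root_from_param_set_id": {"config": {"param_set_id": 1}}}}
)
path_to_folder = result.output_for_node("get_ws_root_from_param_set_id")  # -> '/fldr1'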