# ask-community
p
Hi all, can a job be created with multiple graphs? I normally use `to_job` on a graph to create the job, but I would like to create a job which executes multiple graphs. Currently I am launching them as follows:
```python
from dagster import config_from_files, file_relative_path

# current setup: two separate jobs, one per graph
train = train_graph.to_job(
    name="dev__train",
    config=config_from_files([file_relative_path(__file__, "dev__train.yaml")]),
    resource_defs={
        "hydra_config_manager": hydra_config_manager,
        "secrets_manager": secrets_manager,
        "mlflow_config_manager": mlflow_config_manager,
    },
)

predict = predict_graph.to_job(
    name="dev__infer",
    config=config_from_files([file_relative_path(__file__, "dev__infer.yaml")]),
    resource_defs={"hydra_config_manager": hydra_config_manager},
)
```
What I am trying to do is as follows:
```python
@job
def train_and_predict_job():
    train_graph()
    predict_graph()
```
However, how can I specify the configurations and resources for the graphs from here? Additionally, this will be called from a schedule, and I would like to specify separate `run_config` dictionaries for the two parts of the job, because they share some keys (with different values). I am wondering if that is also possible to specify here.
j
hey @Pankaj Daga, the `@job` decorator takes `resource_defs` and `config` arguments: https://docs.dagster.io/_apidocs/jobs#dagster.job. As for having the same config keys for each graph, you'll likely run into some issues there, as I don't believe op config is nested under the graph name. You may want to make the keys unique.
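something along these lines might work, roughly (just a sketch, assuming the graphs and resources you defined above; the combined yaml filename is made up):
```python
from dagster import job, config_from_files, file_relative_path

@job(
    resource_defs={
        "hydra_config_manager": hydra_config_manager,
        "secrets_manager": secrets_manager,
        "mlflow_config_manager": mlflow_config_manager,
    },
    # hypothetical combined config file covering the ops of both graphs
    config=config_from_files([file_relative_path(__file__, "dev__train_and_infer.yaml")]),
)
def train_and_predict_job():
    train_graph()
    predict_graph()
```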
p
yeah, making these keys unique makes it quite difficult. I think I will need to find another way.
j
the `@graph` decorator also takes a `config` argument, so you may be able to provide the config to each graph there. I'm still not sure how the conflicting key names will resolve, though; you'd have to try it and see.
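roughly like this (just a sketch; the op name, config key, and schema are placeholders, not from your project):
```python
from dagster import graph, op, ConfigMapping

# placeholder op, only to make the sketch self-contained
@op(config_schema={"model_dir": str})
def train_op(context):
    context.log.info(context.op_config["model_dir"])

# give the graph its own config schema and map it down onto its ops
@graph(
    config=ConfigMapping(
        config_schema={"model_dir": str},
        config_fn=lambda cfg: {"train_op": {"config": {"model_dir": cfg["model_dir"]}}},
    )
)
def train_graph():
    train_op()
```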
❤️ 1