# ask-community
We have a graph-defined asset composed of an op that fans out the data, an op that processes each chunk of data, and a collector op. I would like each chunk-processing op to run in its own separate Docker container. When I specify that the sensor's job should use the docker_executor, it's unclear whether a separate Docker container is created for each of the processing ops; my gut tells me it's all running in a single Docker container. Is there any way to specify that an interior op in a job runs in its own container?
This makes me suspect it's launching each step inside the same Docker container instead of spinning up a new one for each. The graph is below; a sketch of the wiring I'm imagining follows it.
```python
import pandas as pd
from dagster import graph


@graph
def full_scored_data_set(recruiter_team_to_score, trained_model) -> pd.DataFrame:
    """chunk the recruiters to score and score them in batches"""
    result = (
        chunking_op(recruiter_team_to_score, trained_model)
        .map(generate_score_for_ids)
        .collect()
    )
    return append_scores(result)
```
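For reference, here's a minimal sketch of the wiring I have in mind, assuming `dagster-docker` is installed and a Docker-capable run launcher (e.g. `DockerRunLauncher`) is configured. The job name is a placeholder I made up; my (unconfirmed) understanding is that `docker_executor` is supposed to launch each step in its own container, which is exactly what I'm trying to verify:

```python
# Sketch only: wrap the graph as a graph-backed asset and give the job
# that materializes it the docker_executor, which (as I understand it)
# launches each step in its own container.
from dagster import AssetsDefinition, Definitions, define_asset_job
from dagster_docker import docker_executor

# The graph's inputs (recruiter_team_to_score, trained_model) become
# upstream assets of the same names.
full_scored_data_set_asset = AssetsDefinition.from_graph(full_scored_data_set)

score_job = define_asset_job(
    name="score_job",  # placeholder name
    selection="full_scored_data_set",
    executor_def=docker_executor,  # per-step containers, if my reading is right
)

defs = Definitions(
    assets=[full_scored_data_set_asset],
    jobs=[score_job],
)
```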
Basically I want each mapped invocation of `generate_score_for_ids` to run in a separate Docker container. Can I have the chunking_op (which yields a dynamic output) be run as a separate asset that my graph takes in?
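On that second question: I don't think a dynamic output can cross an asset boundary directly, so the variant I've been sketching materializes the chunk list as a regular asset and re-fans-out inside the graph. This is a sketch, not something I've confirmed works, and `make_chunks` is a hypothetical helper:

```python
# Sketch only: pull chunking out into its own asset. The asset holds the
# plain list of chunks; the scoring graph does the dynamic fan-out
# internally.
from dagster import DynamicOut, DynamicOutput, asset, graph, op


@asset
def recruiter_chunks(recruiter_team_to_score, trained_model):
    """Plain asset holding the list of chunks to score."""
    return make_chunks(recruiter_team_to_score, trained_model)  # hypothetical helper


@op(out=DynamicOut())
def fan_out_chunks(chunks):
    # Re-emit each chunk as a dynamic output inside the graph.
    for idx, chunk in enumerate(chunks):
        yield DynamicOutput(chunk, mapping_key=str(idx))


@graph
def full_scored_data_set(recruiter_chunks):
    results = fan_out_chunks(recruiter_chunks).map(generate_score_for_ids).collect()
    return append_scores(results)
```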