Hi 👋 I'm looking into the Spark integration documentation so that I can launch Spark jobs on AWS EMR. Do I understand correctly that there is no op that will launch an EMR cluster and then run a step on it, and that instead the EMR cluster must already be running so that Dagster can add/launch a step on that cluster? I'm looking for something similar to Airflow's EmrCreateJobFlowOperator.