# ask-community
Hi 👋 I am looking into the Spark integration documentation so that I can launch Spark jobs on AWS EMR. Do I understand correctly that there is no op which will launch an EMR cluster and then run a step on it, and that instead the EMR cluster must already be running, with Dagster adding/launching a step on that cluster? I am looking for something similar to Airflow's EmrCreateJobFlowOperator.
I guess I am answering my own question here but looking at the code here, we can call EmrJobRunner().run_job_flow(). Is this the correct way to use this class and method? 🙂
Hey Daniel - good code-spelunking. Yes, that is correct!
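For anyone landing here later, a rough sketch of how that could look. Note this is an assumption-laden example, not official usage: the `cluster_config` dict mirrors the boto3 EMR `run_job_flow` kwargs, and the exact `EmrJobRunner` constructor and `run_job_flow` arguments should be checked against the dagster-aws source.

```python
# Sketch only: a cluster config in the shape boto3's EMR
# run_job_flow expects (values here are illustrative).
cluster_config = {
    "Name": "dagster-emr-example",
    "ReleaseLabel": "emr-6.2.0",
    "Instances": {
        "MasterInstanceType": "m5.xlarge",
        "SlaveInstanceType": "m5.xlarge",
        "InstanceCount": 3,
        # Keep the cluster alive so Dagster can add steps to it.
        "KeepJobFlowAliveWhenNoSteps": True,
    },
    "Applications": [{"Name": "Spark"}],
    "JobFlowRole": "EMR_EC2_DefaultRole",
    "ServiceRole": "EMR_DefaultRole",
}

# Hypothetical usage (needs AWS credentials, so not executed here;
# argument names are assumptions -- verify against dagster-aws):
# from dagster_aws.emr import EmrJobRunner
# runner = EmrJobRunner(region="us-east-1")
# cluster_id = runner.run_job_flow(context.log, cluster_config)
```

The config dict is the same shape you would pass to boto3 directly, so the EMR API reference is the place to look for the full set of keys.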
Great, thanks! 🙂