# ask-community
Hi all, I'm using Databricks with Dagster as the orchestration tool. I want to execute a Databricks notebook as an op in Dagster. Is it feasible to use `databricks_pyspark_step_launcher` for this, or should I just use `create_databricks_job_op`? It seems like `databricks_pyspark_step_launcher` just runs the op's PySpark code on the Spark cluster.
You'll want to use `create_databricks_job_op` for running existing notebooks. `databricks_pyspark_step_launcher` runs code defined in an op on a remote cluster, as a way of keeping your Spark code defined in ops instead of in notebooks.