# ask-community

**tamolo** (05/24/2022, 11:03 AM)
Hi all, I'm using Databricks with Dagster as my orchestration tool, and I want to execute a Databricks notebook as an op in Dagster. Is it feasible to use `databricks_pyspark_step_launcher` for this, or should I just use `create_databricks_job_op`? It seems like `databricks_pyspark_step_launcher` just executes code from the PySpark package on a Spark cluster.

**Zach** (05/24/2022, 5:10 PM)
You'll want to use `create_databricks_job_op` for running existing notebooks. `databricks_pyspark_step_launcher` runs code defined in an op remotely on a Databricks cluster; it's a way of keeping your Spark code defined in ops instead of in notebooks.