# ask-community
b
Hi! We are trying to use policy IDs when calling the Databricks jobs/create API, but the `dagster_databricks.databricks_pyspark_step_launcher` config schema does not allow it (`run_config > cluster > new`). It seems to be outdated compared with the Databricks API documentation (`new_cluster > policy_id`). Is this something you have in your pipeline?
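For reference, this is where the field sits in the Databricks API itself — a minimal sketch of a jobs/create payload, with placeholder cluster values and policy ID:
```python
# Minimal sketch of a Databricks jobs/create payload (placeholder values);
# per the Databricks API docs, policy_id sits directly under new_cluster.
job_payload = {
    "name": "example-job",
    "tasks": [
        {
            "task_key": "main",
            "new_cluster": {
                "spark_version": "13.3.x-scala2.12",
                "node_type_id": "i3.xlarge",
                "num_workers": 2,
                "policy_id": "<your-cluster-policy-id>",
            },
            "notebook_task": {"notebook_path": "/Workspace/path/to/notebook"},
        }
    ],
}
```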
Likewise, the `init_scripts` field does not yet support the most recent options, like `workspace`; DBFS was deprecated
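For comparison, a sketch of the `init_scripts` shapes the Databricks API currently accepts (paths are placeholders):
```python
# Sketch of init_scripts entries per the Databricks API (placeholder paths).
# The newer workspace/volumes destinations aren't in the step launcher schema yet;
# the dbfs form is deprecated.
init_scripts = [
    {"workspace": {"destination": "/Users/someone@example.com/init.sh"}},
    {"volumes": {"destination": "/Volumes/catalog/schema/volume/init.sh"}},
    # {"dbfs": {"destination": "dbfs:/init-scripts/init.sh"}},  # deprecated
]
```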
z
Hey, I'm working on a PR for this and other updates to the config schema for the Databricks step launcher
b
Hey @Zach, thanks! Do you have a link for it or something that we can track?
z
Sure - https://github.com/dagster-io/dagster/pull/16316 is for the init scripts update. I should have a PR out for the policy ID thing later today or tomorrow. I'm hoping to further extend the config schema for the step launcher this week, as there have been a number of additions to the `/jobs/runs/submit` endpoint in the last 6 months or so which haven't been reflected in the config schema. However, in the meantime you can use `create_databricks_submit_run_op` or `create_databricks_run_now_op` to create ops that will execute a Databricks job, using the `databricks_job_configuration` argument with a config dict matching the respective API endpoint. It's not exactly the same functionality as the step launcher, as it requires you to point to code that is accessible to Databricks, but it might still be useful
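Roughly like this — a minimal sketch, where the workspace host, token env var, and notebook path are placeholders rather than a tested example:
```python
from dagster import job
from dagster_databricks import create_databricks_submit_run_op, databricks_client

# Op that submits a one-off run via the /jobs/runs/submit endpoint;
# the config dict is passed through to the API, so policy_id is honored.
submit_databricks_run = create_databricks_submit_run_op(
    databricks_job_configuration={
        "run_name": "example-run",
        "new_cluster": {
            "spark_version": "13.3.x-scala2.12",
            "node_type_id": "i3.xlarge",
            "num_workers": 2,
            "policy_id": "<your-cluster-policy-id>",
        },
        "notebook_task": {"notebook_path": "/Workspace/path/to/notebook"},
    },
)

@job(
    resource_defs={
        # The generated op requires a "databricks" resource for API access.
        "databricks": databricks_client.configured(
            {
                "host": "https://<workspace>.cloud.databricks.com",
                "token": {"env": "DATABRICKS_TOKEN"},
            }
        )
    }
)
def run_databricks_notebook():
    submit_databricks_run()
```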
Here's the policy ID update PR