# ask-ai
Yes, you can create an asset in Dagster that submits a Databricks job. The documentation shows how to model a Databricks API call either as a Dagster op or as the computation backing a software-defined asset. You can use the `dagster-databricks` op factories to create ops that invoke the Databricks Jobs APIs, such as [Run Now](https://docs.databricks.com/api-explorer/workspace/jobs/runnow) or [Submit Run](https://docs.databricks.com/api-explorer/workspace/jobs/submit), or you can write your own Dagster op or asset that connects to Databricks using the configured Databricks resource.
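For example, the op-factory route might look roughly like this. This is a minimal, untested sketch: the job ID, host, and token are placeholders you would supply, and the exact factory arguments can vary by `dagster-databricks` version.

```python
from dagster import EnvVar, job
from dagster_databricks import DatabricksClientResource, create_databricks_run_now_op

# Build an op that calls the Databricks "Run Now" API for an existing job
# and polls the run until it reaches a terminal state.
run_existing_databricks_job = create_databricks_run_now_op(
    databricks_job_id=1234,  # placeholder: your Databricks job ID
)


# The generated op expects a "databricks" resource for authentication.
@job(
    resource_defs={
        "databricks": DatabricksClientResource(
            host=EnvVar("DATABRICKS_HOST"),
            token=EnvVar("DATABRICKS_TOKEN"),
        )
    }
)
def run_databricks_job():
    run_existing_databricks_job()
```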
After defining the op or asset, you would create a Dagster job that invokes the op (or selects the asset) to run the Databricks API call.

If a Databricks job fails and the failure is captured by the Dagster asset or op that submitted it, that failure is reflected in Dagster. For example, if an asset submits a Databricks job and the job fails, Dagster's asset observation and materialization mechanisms surface the error, and the asset catalog and Dagit UI display the status of the asset, including any failures that occur during execution.

For more detailed guidance on integrating Databricks with Dagster, including how to handle Databricks job submissions and failures, see [Integrating Databricks with Dagster Pipes](https://docs.dagster.io/guides/dagster-pipes/databricks) and [Using Dagster with Databricks](https://docs.dagster.io/integrations/databricks). These resources provide examples and explanations on how to set up and manage interactions between Dagster and Databricks.
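To illustrate the hand-rolled asset route and how a failure surfaces, a sketch along these lines could work. It is untested and assumes the `databricks-sdk` `WorkspaceClient`; the job ID, environment variable names, and asset name are placeholders to adapt to your setup.

```python
import os

from dagster import MaterializeResult, asset
from databricks.sdk import WorkspaceClient
from databricks.sdk.service import jobs


@asset
def databricks_job_run() -> MaterializeResult:
    client = WorkspaceClient(
        host=os.environ["DATABRICKS_HOST"],    # placeholder: workspace URL
        token=os.environ["DATABRICKS_TOKEN"],  # placeholder: access token
    )
    # Trigger an existing Databricks job and block until the run terminates.
    run = client.jobs.run_now(job_id=1234).result()  # placeholder job ID
    # Raising here fails the materialization, so the Databricks failure
    # shows up on the asset in the Dagster UI.
    if run.state and run.state.result_state != jobs.RunResultState.SUCCESS:
        raise Exception(f"Databricks run failed: {run.state.state_message}")
    return MaterializeResult(metadata={"run_page_url": run.run_page_url})
```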