# ask-ai


02/20/2024, 5:13 PM
This message was deleted.


02/20/2024, 5:13 PM
Yes, you can create an asset in Dagster that submits a Databricks job. The documentation shows how to model a Databricks API call as either a Dagster op or as the computation backing a software-defined asset. You can use the op factories to create ops that invoke Databricks Jobs API endpoints such as Run Now or Submit Run, or you can write your own op or asset that connects to Databricks through the configured Databricks resource. After defining the op or asset, you create a Dagster job that invokes the op (or selects the asset) to make the Databricks API call.

If the Databricks job fails and that failure is captured by the Dagster op or asset that submitted it, the failure is reflected in Dagster. For example, if an asset submits a Databricks job and the job fails, Dagster's asset observation and materialization mechanisms surface that information, and the asset catalog and Dagit UI display the status of assets, including any failures that occur during execution.

For more detailed guidance on integrating Databricks with Dagster, including how to handle Databricks job submissions and failures, refer to the documentation pages "Integrating Databricks with Dagster Pipes" and "Using Dagster with Databricks". Both provide examples and explanations of how to set up and manage interactions between Dagster and Databricks.
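As a rough sketch of what the computation behind such an asset might do, the code below calls the Databricks Jobs 2.1 `run-now` REST endpoint directly and classifies terminal run states so a failed run can be raised as an exception (which Dagster would then record as a failed materialization). The host, token, and helper names here are illustrative assumptions, not Dagster's or Databricks' actual client API; in practice you would use the configured Databricks resource or the `databricks-sdk` instead of raw HTTP.

```python
import json
import urllib.request
from typing import Optional

def submit_run_now(host: str, token: str, job_id: int,
                   notebook_params: Optional[dict] = None) -> dict:
    """Trigger an existing Databricks job via the Jobs 2.1 run-now endpoint.

    `host` and `token` are illustrative placeholders; in a Dagster op or
    asset they would come from the configured Databricks resource.
    """
    body = {"job_id": job_id}
    if notebook_params:
        body["notebook_params"] = notebook_params
    req = urllib.request.Request(
        f"{host}/api/2.1/jobs/run-now",
        data=json.dumps(body).encode(),
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)  # contains the run_id of the new run

def run_is_terminal(state: dict) -> bool:
    """True once the run's life_cycle_state indicates it will not progress."""
    return state.get("life_cycle_state") in {
        "TERMINATED", "SKIPPED", "INTERNAL_ERROR",
    }

def run_failed(state: dict) -> bool:
    """True when the run finished but did not succeed.

    An op or asset body would poll the run, call this on each state,
    and raise an exception on failure so Dagster marks the step failed.
    """
    return run_is_terminal(state) and state.get("result_state") != "SUCCESS"
```

Raising on `run_failed(...)` inside the op/asset body is what propagates the Databricks failure into Dagster's run and asset status views.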