Version: 1.45

Creating Databricks Unity Catalog Deployments

With the Databricks Unity Catalog integration set up, you can create Deployments by following the steps outlined in Creating a Deployment. Note that the Databricks Unity Catalog is not supported for SageMaker and Azure Machine Learning Deployments.

Prerequisites

  • You have added a Repository that adheres to the requirements. Note that Repositories used for Databricks Unity Catalog Deployments must use the reference system.
  • You have either set a temporary Databricks Unity Catalog access token in the deployment flow, or set up a personal access token in the integration credentials (not recommended).

Updating the Deployment

If you have configured your model using an alias and transitioned a new model version to that alias, you can update your model and explainer without reconfiguring your Deeploy settings. You can update either via the Deeploy application (go to the Deployment details, then click Update) or by calling Deeploy's API to patch the Deployment with an empty request body. See our Swagger documentation for more information.

info

If you use a temporary access token, make sure to include it when updating your Deployment.
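
The API update described above can be sketched as a PATCH request with an empty body. This is a minimal illustration using only the Python standard library; the host, workspace, and endpoint path are placeholder assumptions, so consult Deeploy's Swagger documentation for the exact route and parameters in your version.

```python
import json
import urllib.request

def build_patch_request(host, workspace_id, deployment_id, token):
    """Build an empty-body PATCH request to update a Deployment.

    The endpoint path below is a hypothetical example; check the
    Swagger documentation for the actual route.
    """
    url = f"{host}/workspaces/{workspace_id}/deployments/{deployment_id}"
    return urllib.request.Request(
        url,
        data=json.dumps({}).encode(),  # empty request body triggers the update
        method="PATCH",
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
    )

# Sending the request requires network access and valid credentials:
# urllib.request.urlopen(build_patch_request(
#     "https://app.example-deeploy.com", "my-workspace", "my-deployment", "TOKEN"))
```

If you deployed with a temporary access token, include it in the request as noted above.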

PyFunc models

To deploy PyFunc models in Deeploy, see the MLFlow integration page.