Temporary credentials
Authenticate using temporary credentials in the create or update Deployment flow. Temporary credentials offer better security because they are not stored and expire quickly.
Object storage
Select Private object storage and click Temporary credentials to authenticate with temporary credentials. Instructions are available for the following providers:
- S3
- Azure Blob Storage
- Databricks
S3
- Use AWS STS to assume a role with access to your bucket
- Get a session token (see the example below)
- In the Deployment configuration, click Private object storage, then click Temporary credentials, then Use temporary credentials
- Select S3 and copy-paste your:
  - Access key ID (e.g. `ASIAIOSFODNN7EXAMPLE`)
  - Secret access key (e.g. `ASIAwJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEYIOSFODNN7EXAMPLE`)
  - Session token (e.g. `AQoDYXdzEJr...<remainder of session token>`)
- Click Save
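As a sketch, the two STS steps above can be done with the AWS CLI; the role ARN, session name, and duration below are placeholder values:

```bash
# Assume a role with read access to the bucket; the response contains
# temporary credentials that expire after the requested duration.
aws sts assume-role \
  --role-arn arn:aws:iam::012345678901:role/deeploy-bucket-role \
  --role-session-name deeploy-temporary-credentials \
  --duration-seconds 3600

# From the JSON response, paste these fields into the Deployment form:
#   Credentials.AccessKeyId     -> Access key ID
#   Credentials.SecretAccessKey -> Secret access key
#   Credentials.SessionToken    -> Session token
```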
Azure Blob Storage
- Go to your Azure storage account
- Select Security + networking -> Shared access signature
- Generate a SAS token (a CLI alternative is sketched below) with minimally:
  - Allowed services: Blob
  - Allowed resource types: Container, Object
  - Allowed permissions: Read, List
- In the Deployment configuration, click Private object storage, then click Temporary credentials, then Use temporary credentials
- Select Azure Blob Storage and copy-paste your:
  - SAS token (e.g. `bsv=2023-11-31&sr=b&sig=39Up9jzHkxhUIhFEjEh954DJxe6cIRCgOv6IGCSØ%3A377&sp=rcw`)
  - Object storage name (e.g. `deeploy`)
- Click Save
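The portal steps above can also be scripted. A minimal sketch with the Azure CLI, assuming a storage account named deeploy and using a placeholder expiry and account key:

```bash
# Account-level SAS limited to the Blob service, container and object resource
# types, and read/list permissions (mirrors the portal settings above).
az storage account generate-sas \
  --account-name deeploy \
  --account-key "<storage account key>" \
  --services b \
  --resource-types co \
  --permissions rl \
  --expiry 2030-01-01T00:00Z \
  --https-only
# The command prints the SAS token to paste into the Deployment form.
```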
Databricks
- In Databricks, go to Settings, click Identity and access, then click Manage Service Principals
- Add a new Service Principal with Workspace access
- Give the Service Principal the required permissions
- Click on the created Service Principal, click Secrets, then Generate secret
- Use the generated secret and client ID to make an authenticated request to the token endpoint and receive a temporary workspace-level access token (see the example below)
- Select Databricks Unity Catalog and copy-paste your:
  - Access token (e.g. `eyJraWQiOiIy...<remainder of access token>`)
- Click Save
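A sketch of the token request, using the OAuth client credentials flow against the workspace token endpoint; the workspace URL, client ID, and secret are placeholders:

```bash
# Exchange the Service Principal's client ID and secret for a short-lived
# workspace-level OAuth access token.
curl --request POST \
  --user "<client-id>:<generated-secret>" \
  --data "grant_type=client_credentials&scope=all-apis" \
  "https://<workspace-url>/oidc/v1/token"
# The JSON response contains "access_token"; paste that value as the Access token.
```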
Docker
Select Private registry and click Temporary credentials to authenticate with temporary credentials. Instructions are available for the following registries:
- DockerHub
- AWS
- Azure
- GCP
DockerHub
- For DockerHub, fill in:
  - Registry (e.g. `https://index.docker.io/v1/`)
  - Username (e.g. `deeployml`)
  - Password (e.g. `<password>`)
- Click Save
AWS
- Use the AWS CLI to get a temporary login password using `aws ecr get-login-password --region region`
- For Amazon Elastic Container Registry, fill in:
  - Registry (e.g. `012345678901.dkr.ecr.eu-central-1.amazonaws.com`)
  - Username (`AWS`)
  - Password (e.g. `eyJraWQiOiIy...<remainder of access token>`)
- Click Save
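For example, assuming your registry is in eu-central-1:

```bash
# Prints a temporary login password (valid for 12 hours); use it as the
# Password together with the fixed username AWS.
aws ecr get-login-password --region eu-central-1
```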
Azure
- Use the Azure CLI to get a temporary access token using `az acr login --name <acrName> --expose-token`
- For Azure Container Registry, fill in:
  - Registry (e.g. `deeploy.azurecr.io`)
  - Username (`00000000-0000-0000-0000-000000000000`)
  - Password (e.g. `eyJhbGci...<remainder of access token>`)
- Click Save
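For example, assuming a registry named deeploy (jq is only used to extract the token from the JSON output):

```bash
# Request a temporary access token without performing a local Docker login.
az acr login --name deeploy --expose-token --output json | jq -r '.accessToken'
# Use the printed token as the Password; the Username stays the fixed GUID
# 00000000-0000-0000-0000-000000000000.
```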
GCP
- Create a service account for the artifact registry, and grant it repository access
- Generate an access token for the service account (see the example below)
- For Google Artifact Registry, fill in:
  - Registry (e.g. `europe-central2-docker.pkg.dev/project-name/repository`)
  - Username (`oauth2accesstoken`)
  - Password (e.g. `ya29.c.c...<remainder of access token>`)
- Click Save
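One way to script these steps, as a sketch; the project, repository, location, and service account names are placeholders, and the token is obtained by impersonating the service account (which requires the iam.serviceAccountTokenCreator role on that service account):

```bash
# Placeholder project and service-account names
PROJECT=project-name
SA=deeploy-registry@${PROJECT}.iam.gserviceaccount.com

# 1. Create the service account
gcloud iam service-accounts create deeploy-registry --project "$PROJECT"

# 2. Grant it read access on the Artifact Registry repository
gcloud artifacts repositories add-iam-policy-binding repository \
  --project "$PROJECT" \
  --location europe-central2 \
  --member "serviceAccount:${SA}" \
  --role roles/artifactregistry.reader

# 3. Print a short-lived access token for the service account
gcloud auth print-access-token --impersonate-service-account "$SA"
# Use the printed token as the Password with username oauth2accesstoken.
```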