Temporary credentials
Authenticate to your private object storage using temporary credentials while creating or updating your Deployment. Temporary credentials improve security because they expire after a short time. Temporary credentials are supported for S3, Azure Blob Storage, and Databricks Unity Catalog.

S3

- Use AWS STS to assume a role with access to your bucket
- Get a session token (a minimal sketch follows this list)
- In the Deployment configuration, click Private object storage, then click Temporary credentials, then Use temporary credentials
- Select S3 and copy-paste your:
  - Access key ID (e.g. `ASIAIOSFODNN7EXAMPLE`)
  - Secret access key (e.g. `ASIAwJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEYIOSFODNN7EXAMPLE`)
  - Session token (e.g. `AQoDYXdzEJr...<remainder of session token>`)
- Click Save
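
The two STS steps above can be done, for example, with boto3 (the AWS SDK for Python). This is a minimal sketch, assuming an existing IAM role with read access to your bucket; the role ARN and session name are placeholder values:

```python
import boto3

# Assume a role that has access to the bucket; the ARN below is a
# hypothetical placeholder for your own role.
sts = boto3.client("sts")
response = sts.assume_role(
    RoleArn="arn:aws:iam::123456789012:role/deeploy-bucket-reader",
    RoleSessionName="deeploy-temporary-credentials",
    DurationSeconds=3600,  # the returned credentials expire after one hour
)

creds = response["Credentials"]
print(creds["AccessKeyId"])      # paste as Access key ID
print(creds["SecretAccessKey"])  # paste as Secret access key
print(creds["SessionToken"])     # paste as Session token
```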

Azure Blob Storage

- Go to your Azure storage account
- Select Security + networking -> Shared access signature
- Generate a SAS token with at least the following settings (a programmatic alternative is sketched after this list):
- Allowed services: Blob
- Allowed resource types: Container, Object
- Allowed permissions: Read, List
- In the Deployment configuration, click Private object storage, then click Temporary credentials, then Use temporary credentials
- Select Azure Blob Storage and copy-paste your:
  - SAS token (e.g. `sv=2023-11-31&sr=b&sig=39Up9jzHkxhUIhFEjEh954DJxe6cIRCgOv6IGCS0%3A377&sp=rcw`)
  - Object storage name (e.g. `deeploy`)
- Click Save
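
Instead of the portal, the SAS token can also be generated programmatically. This is a minimal sketch using the azure-storage-blob Python package; the account name and account key are placeholders for your own storage account's values:

```python
from datetime import datetime, timedelta, timezone

from azure.storage.blob import (
    AccountSasPermissions,
    ResourceTypes,
    generate_account_sas,
)

# Account name and key are placeholders; generate_account_sas scopes the
# token to the Blob service, matching the settings listed above.
sas_token = generate_account_sas(
    account_name="deeploy",
    account_key="<storage-account-key>",
    resource_types=ResourceTypes(container=True, object=True),  # Container, Object
    permission=AccountSasPermissions(read=True, list=True),     # Read, List
    expiry=datetime.now(timezone.utc) + timedelta(hours=1),     # short-lived
)
print(sas_token)  # paste as SAS token
```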

Databricks

- In Databricks, go to Settings, click Identity and access, then click Manage Service Principals
- Add a new Service Principal with Workspace access
- Give the Service Principal the required permissions
- Click on the created Service Principal, click Secrets, then Generate secret
- Use the generated secret and client ID to make an authenticated request to the token endpoint and receive a temporary workspace-level access token (a minimal sketch follows this list)
- In the Deployment configuration, click Private object storage, then click Temporary credentials, then Use temporary credentials
- Select Databricks Unity Catalog and copy-paste your:
  - Access token (e.g. `eyJraWQiOiIy...<remainder of access token>`)
- Click Save
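
The token-endpoint request above follows the standard OAuth client-credentials flow. This is a minimal sketch using Python's requests library; the workspace URL, client ID, and secret are placeholders for your own values:

```python
import requests

# Exchange the Service Principal's client ID and secret for a temporary
# workspace-level access token; all values below are placeholders.
workspace_url = "https://<your-workspace-host>"
response = requests.post(
    f"{workspace_url}/oidc/v1/token",
    auth=("<service-principal-client-id>", "<generated-secret>"),
    data={"grant_type": "client_credentials", "scope": "all-apis"},
)
response.raise_for_status()
print(response.json()["access_token"])  # paste as Access token
```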