
AWS cloud resources

We advise running Deeploy on the following managed AWS services:

  • EKS: Managed Kubernetes to run the Deeploy software and AI/ML deployments on
  • RDS (Aurora) for PostgreSQL: Managed database to store application and AI/ML deployment data
  • S3: Object storage to store repository and model files and artifacts
  • KMS: Key management and encryption to store sensitive data encrypted in the database

The following AWS reference architecture is advised for a basic Deeploy installation:

AWS reference architecture (diagram). See the Networking section below for more guidance on the networking setup.

For production workloads, we advise managing the cloud resources as code. AWS provides providers to manage your infrastructure as code; see the Terraform documentation for the Terraform/OpenTofu providers.

If you are planning to use Deeploy in combination with the AWS marketplace, more guidance can be found here.

EKS

We suggest using a managed Kubernetes cluster (Stateless): AWS EKS.

To set up AWS EKS for your Deeploy installation, follow the steps outlined in the AWS EKS guide. Keep in mind the following considerations:

  • For normal usage, Deeploy requires approximately 3 medium nodes; minimal requirements: 3 (v)CPU and 6 GB RAM (see the sketch after this list)
  • Kubernetes version: we advise using only standard supported versions to prevent extra costs
  • Enable autoscaling for your EKS cluster. This prevents running into resource limits, but take into account that this also results in dynamic costs
  • Deeploy supports multiple types of load balancing with EKS; check which best practices suit your situation here
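
As a starting point, a cluster along these lines covers the sizing and autoscaling considerations above. This is a minimal sketch using an eksctl cluster config; the cluster name, region, Kubernetes version, and instance type are illustrative and should be adapted to your situation:

apiVersion: eksctl.io/v1alpha5
kind: ClusterConfig

metadata:
  name: deeploy-cluster            # illustrative name
  region: eu-west-1                # illustrative region
  version: "<KUBERNETES-VERSION>"  # use a standard supported version

iam:
  withOIDC: true                   # needed later for the IAM assumable roles with OIDC

managedNodeGroups:
  - name: deeploy-system
    instanceType: t3.medium        # roughly matches the "3 medium nodes" guidance
    desiredCapacity: 3
    minSize: 3
    maxSize: 6                     # upper bound if you enable the cluster autoscaler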

GPU support

For GPU support in EKS, we recommend attaching an autoscaling node group with your preferred GPU node type that can scale to 0 to your EKS cluster. Alternatively, use Karpenter for flexible node provisioning. To make sure no other pods will be scheduled on GPU nodes, you can add the following taint to your node pool:

taints:
  - key: nvidia.com/gpu
    value: present
    effect: NoSchedule
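
For example, with eksctl the same taint can be set on a managed node group that is allowed to scale to 0. This is a minimal sketch; the node group name and GPU instance type are illustrative:

managedNodeGroups:
  - name: gpu-nodes                # illustrative name
    instanceType: g4dn.xlarge      # your preferred GPU node type
    desiredCapacity: 0
    minSize: 0                     # allows the node group to scale to 0
    maxSize: 2
    taints:
      - key: nvidia.com/gpu
        value: present
        effect: NoSchedule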

When specifying a GPU node for a Deeploy deployment, the nvidia.com/gpu label is applied automatically. To make sure you can select the nodes that are scaled to 0 in Deeploy, add a list of node types to the values.yaml file when you install the Deeploy Helm chart.
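
The exact key for this list of node types depends on your Deeploy chart version; the key name below is purely hypothetical, so check the chart's values reference for the real one:

# Hypothetical key name; consult the Deeploy chart documentation for the actual key.
gpuNodeTypes:
  - g4dn.xlarge
  - g4dn.2xlarge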

Read more about our NVIDIA integration here.

RDS for PostgreSQL

Deeploy supports all Postgres databases offered by AWS. For example, follow the steps outlined in the AWS RDS for PostgreSQL guide. Take into account the following considerations when setting up your database:

  • Enable AWS RDS Storage Autoscaling to accommodate data growth over time without manual intervention.
  • Align the network configuration of the RDS database with the EKS cluster (same VPC and subnets). This will allow for data transfers over the internal AWS network.
  • Implement best practices for backing up and restoring data at any point in time, as described in this article.
  • Create a separate user with admin rights only on the required databases (deeploy and deeploy_kratos). Save the user credentials to use in the values.yaml file for the Deeploy Helm installation (see the sketch after this list).
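
Those credentials typically end up in the database section of the Helm values. The key names below are illustrative rather than the authoritative chart keys, so verify them against the values reference of your Deeploy chart version:

# Hypothetical key names; check your Deeploy chart's values reference.
database:
  host: <YOUR-RDS-ENDPOINT>
  port: 5432
  user: <YOUR-DB-ADMIN-USER>          # the separate user created above
  password: <YOUR-DB-ADMIN-PASSWORD>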

Database configuration

Make sure that the Postgres database server has the following two databases:

  1. deeploy
  2. deeploy_kratos
info
  • A user should have administrative rights on both databases.
  • The databases should have at least one (public) database schema.

S3

To set up an AWS S3 bucket, use the AWS S3 guide. Additionally, keep in mind the following considerations:

  • We suggest using a single IAM role to access the S3 bucket from the EKS cluster, using an IAM assumable role with OIDC, with the following minimal required IAM access policy:
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "s3:ListBucket",
        "s3:GetBucketLocation",
        "s3:ListBucketMultipartUploads",
        "s3:ListBucketVersions"
      ],
      "Resource": [
        "arn:aws:s3:::<YOUR-BUCKET-NAME>"
      ]
    },
    {
      "Effect": "Allow",
      "Action": [
        "s3:*Object*",
        "s3:ListMultipartUploadParts",
        "s3:AbortMultipartUpload"
      ],
      "Resource": [
        "arn:aws:s3:::<YOUR-BUCKET-NAME>/*"
      ]
    }
  ]
}
  • Allow EKS pods to assume your role. By providing the role ARN for the key objectStorage.aws.trustedIamRoleArn.eks\.amazonaws\.com/role-arn in the Deeploy values during the installation, the relevant Kubernetes service accounts will be annotated automatically. Moreover, make sure that objectStorage.aws.useEKSPodIdentityWebhook is set to true (see the sketch after this list).
  • Create an S3 Gateway endpoint for your VPC. This will allow for data transfers over the internal AWS network.
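
Put together, the relevant part of the Helm values could look roughly as follows. The nesting is inferred from the key paths above and the role ARN is a placeholder; double-check both against your Deeploy chart version:

objectStorage:
  aws:
    useEKSPodIdentityWebhook: true
    trustedIamRoleArn:
      eks.amazonaws.com/role-arn: arn:aws:iam::<YOUR-ACCOUNT-ID>:role/<YOUR-S3-ROLE>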

KMS

To set up an AWS KMS symmetric encryption key, use the AWS KMS guide. Keep in mind the following considerations:

  • We suggest using a single IAM role to access the KMS from the EKS cluster using an IAM assumable role with OIDC, with the following minimal required IAM access policy:
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "kms:Encrypt",
        "kms:Decrypt",
        "kms:ReEncrypt*",
        "kms:GenerateDataKey*",
        "kms:DescribeKey"
      ],
      "Resource": "*"
    }
  ]
}
  • Allow EKS pods to assume your role. By providing the role ARN for the key security.keyManagement.aws.trustedIamRoleArn.eks\.amazonaws\.com/role-arn in the Deeploy values during the installation, the relevant Kubernetes service accounts will be annotated automatically. Moreover, make sure that security.keyManagement.aws.useEKSPodIdentityWebhook is set to true (see the sketch below).
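
Analogous to the S3 configuration, the corresponding Helm values could look roughly as follows. Again, the nesting is inferred from the key paths above and the role ARN is a placeholder; verify both against your Deeploy chart version:

security:
  keyManagement:
    aws:
      useEKSPodIdentityWebhook: true
      trustedIamRoleArn:
        eks.amazonaws.com/role-arn: arn:aws:iam::<YOUR-ACCOUNT-ID>:role/<YOUR-KMS-ROLE>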

Networking

We suggest becoming familiar with AWS-specific networking as described in this guide. Follow the networking best practices for the EKS and RDS deployments. Check the AWS reference architecture at the top of the page for our recommendations.

Security considerations

  • Deploy the RDS database and the EKS cluster within the same VPC, but use different security groups so that the EKS node group can be specifically allowed in the RDS database's security group.

Resource health

We suggest setting up CloudWatch for monitoring and alerting related to resource health. If you suspect an issue with your AWS Cloud Resources, check out the service health dashboard.

Estimation of costs

In addition to the Deeploy license costs, AWS will bill you for the cloud resources. Check the expected costs with the AWS cost calculator.

Next steps