Non-Standard Deployment Requirements

You can deploy the C3 Agentic AI Platform to suit your organization's needs. To learn more about non-standard deployment options and for supporting documentation, see Non-Standard Deployment Overview.

In a standard cloud deployment, C3 AI Operations installs and deploys the platform for your organization. To learn more about standard deployment requirements and process, see the installation guides at https://c3.ai/legal/.

The following topic describes infrastructure requirements for non-public cloud and on-premises deployments.

Non-public cloud infrastructure requirements

Refer to your cloud provider's documentation to learn more about its non-public cloud regions.

See the C3 AI Installation Guides for a general idea of non-public cloud requirements. Non-public cloud environments require additional customization. Contact the C3 AI Center of Excellence (CoE) if you require a non-public cloud deployment.

On-premises infrastructure requirements

C3 AI does not provide hardware or operating system expertise for on-premises deployments. You must procure, install, configure, and maintain on-premises systems on your own hardware.

Non-standard deployment infrastructure needs vary, and the following sections provide minimum infrastructure requirements if your organization opts for an on-premises deployment.

Hardware

C3 AI does not have a hardware vendor requirement. Your hardware must meet the following minimum specifications:

  • Architecture: Intel/x86
  • CPU: 64 cores
  • RAM: 256 GB
  • Disk storage: 4 TB

The amount of required resources varies depending on the features and platform capabilities you want to use. These specifications support small clusters with smaller workloads and fewer feature requirements. For larger clusters with greater needs, expect to provide about triple these resources.

Increase application data storage by a factor of 3 to 4 to adequately accommodate RAID data replication and backups.
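The factor-of-3-to-4 guidance above can be sketched as a small helper (illustrative only; the function and its name are not part of the C3 AI platform):

```python
def provisioned_storage_tb(app_data_tb: float, replication_factor: int = 3) -> float:
    """Estimate total disk storage (TB) to provision for a given amount of
    application data, per the factor-of-3-to-4 replication and backup guidance."""
    if not 3 <= replication_factor <= 4:
        raise ValueError("guidance assumes a replication factor of 3 to 4")
    return app_data_tb * replication_factor

# For example, 4 TB of application data at a replication factor of 3:
print(provisioned_storage_tb(4.0))  # prints 12.0
```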

GPU instance sizing for LLM inference

To use the C3 AI Generative AI application, or if you have an application that performs LLM inference, you must provide additional GPU resources. Use the following guidance to estimate how much additional GPU capacity these applications require:

Size of model (GB) ≈ number of model parameters (billions) x 2

GPU memory required for inference (GB) ≈ 2 x size of model

For example, if your model has 40 billion parameters, provide 80 GB of disk space and 160 GB of GPU memory:

Size of model = 40 x 2 = 80 GB

GPU memory required = 80 x 2 = 160 GB

Given this guidance, use a machine that provides at least 80 GB of disk storage and 160 GB of GPU memory.
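The sizing formulas above can be expressed as a small helper (illustrative only; the function name is an assumption, not a platform API). The factor of 2 in the model-size formula roughly corresponds to 2 bytes per parameter at FP16 precision:

```python
def llm_inference_sizing(params_billions: float) -> tuple[float, float]:
    """Apply the rule-of-thumb formulas: model size (GB) is parameters
    (billions) x 2, and GPU memory (GB) is 2 x model size."""
    model_size_gb = params_billions * 2
    gpu_memory_gb = model_size_gb * 2
    return model_size_gb, gpu_memory_gb

# The 40-billion-parameter example from above:
print(llm_inference_sizing(40))  # prints (80, 160)
```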

When you choose a machine, consider cost and performance. For example, the NVIDIA H100 and A100 provide the same amount of GPU memory, but the H100 provides faster speeds and costs more than the A100.

To deploy a VllmPipe, the platform requires GPUs with a compute capability rating of 7.0 or greater. See the NVIDIA CUDA GPU documentation for compute capability ratings.
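As an illustrative sketch (not a platform API), checking a GPU's compute capability against the 7.0 minimum can be expressed as a tuple comparison:

```python
def supports_vllm_pipe(compute_capability: tuple[int, int]) -> bool:
    """Return True if the GPU's compute capability (major, minor)
    meets the 7.0 minimum required to deploy a VllmPipe."""
    return compute_capability >= (7, 0)

# V100-class GPUs (compute capability 7.0) qualify; P100-class GPUs (6.0) do not.
print(supports_vllm_pipe((7, 0)))  # prints True
print(supports_vllm_pipe((6, 0)))  # prints False
```

On recent NVIDIA drivers, `nvidia-smi --query-gpu=compute_cap --format=csv` reports this value for each installed GPU.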

(Optional) Virtualization

The platform does not require Kubernetes virtualization; however, C3 AI recommends setting up Kubernetes with virtualization. Virtualization allows for an easier deployment process and more flexible use of hardware infrastructure.

Alternatively, you can deploy Kubernetes on a bare metal instance.

Operating system

The platform requires a Linux OS and supports the following distributions:

  • Red Hat Enterprise Linux
  • Rocky Linux
  • Ubuntu

The platform might support similar Linux distributions; however, CoE has experience supporting only the distributions listed above.

Federal organizations may require OS configurations that follow Security Technical Implementation Guide (STIG) controls and comply with Federal Information Processing Standards (FIPS). The platform supports OS configurations that follow STIG and FIPS requirements.

Kubernetes

Your organization can deploy Kubernetes using RKE2 or OpenShift.

Alternatively, C3 AI can deploy Kubernetes and install RKE2, Longhorn, MetalLB, and Nexus container registry in an on-premises configuration for your organization.

Command line interface tool requirements

Install the following command line interface (CLI) tools:

Depending on your cloud provider, you might need additional tools:
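As a hypothetical illustration (the tool names below are common examples, not the official required list), you can verify which CLI tools are available on your PATH before starting the installation:

```python
import shutil

def missing_tools(tools: list[str]) -> list[str]:
    """Return the subset of CLI tools that are not found on PATH."""
    return [t for t in tools if shutil.which(t) is None]

# Example: common Kubernetes and Terraform tooling (illustrative names only).
print(missing_tools(["kubectl", "helm", "terraform"]))
```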

Download and apply the C3 AI Terraform modules

C3 AI uses Terraform for platform installation and provides you with a registry username and password to pull the Terraform modules from a JFrog registry.

To learn how to pull and run the Terraform modules for platform installation, see Run Terraform Modules for Platform Installation.

Registry requirement

C3 AI publishes all images necessary for deploying and installing the platform to registry.c3.ai. If you cannot allow network access to registry.c3.ai and require a custom registry, you must move images from registry.c3.ai to your own registry.
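As an illustrative sketch (the image name and custom registry host below are hypothetical; only registry.c3.ai comes from this document), moving an image to a custom registry typically involves a pull, retag, and push, which can be scripted as:

```python
def mirror_commands(image: str, custom_registry: str,
                    source_registry: str = "registry.c3.ai") -> list[str]:
    """Build the docker commands that copy one image from the C3 AI
    registry to a custom registry: pull, retag, then push."""
    src = f"{source_registry}/{image}"
    dst = f"{custom_registry}/{image}"
    return [
        f"docker pull {src}",
        f"docker tag {src} {dst}",
        f"docker push {dst}",
    ]

# Hypothetical image name and registry host, for illustration only:
for cmd in mirror_commands("c3/example-image:8.3", "registry.example.com"):
    print(cmd)
```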

If you must use a custom registry, C3 AI recommends using your cloud provider's native registry service.

See the "BOM" section in the Versions and Compatibility dropdown of the C3 AI Platform Install and Upgrade Requirements for the list of images and which ones you must move to your custom registry.

For a description of each image's purpose, see Images Required for Installation and Deployment.

Production-level cluster

If you have a production-level cluster in a non-standard deployment, C3 AI recommends that your organization deploy two clusters.

As a best practice, have another cluster in addition to your production-level cluster. A second cluster minimizes the risk of accidental changes in production and allows you to independently scale and test non-standard deployment configurations.

However, providing infrastructure for two clusters entails higher cost. Consider the cost implications if you choose to deploy a second cluster to support your production-level cluster.

Next step

After your organization meets these requirements for a non-standard deployment, create the required secrets. See Create Secrets for Installation and Deployment.