
Default LLMs and Embedders

The C3 Agentic AI Platform includes pre-configured LLM clients and authentication objects that support the Agent Lifecycle Management features in C3 AI Studio.

Default client and authentication configurations

The following table shows the configured LLM clients and authentication objects for the C3 Agentic AI Platform.

| Requirement Type | Name | Provider | Model | Note |
| --- | --- | --- | --- | --- |
| GenaiCore.Llm.Bedrock.Auth | "default-auth" | AWS Bedrock | N/A | |
| GenaiCore.Llm.AzureOpenAi.Auth | "default-auth" | Azure OpenAI | N/A | |
| GenaiCore.Llm.AzureOpenAi.Auth | "default-auth-embedder" | Azure OpenAI | N/A | |
| GenaiCore.Llm.AzureOpenAi.Auth | "default-auth-responses" | Azure OpenAI | N/A | |
| GenaiCore.Llm.VertexAi.Auth | "default-auth" | GCP Vertex AI | N/A | |
| GenaiCore.Llm.Completion.Client | "default-completions" | Depends on deployment | Claude Sonnet 4.5 | Available on GCP, AWS, and Azure |
| GenaiCore.Llm.Response.Client | "default-responses" | Azure OpenAI | GPT-5.2 | Ignore this client if your deployment does not allow Azure |
| GenaiCore.Llm.Completion.Client | "default-completions-web" | Vertex AI | Gemini 2.5 Flash | The defaultOptions parameter is configured with web_search_options={"search_context_size": "medium"} |
| GenaiCore.Llm.Completion.Client | "claude-sonnet-4.5" | AWS Bedrock | Claude Sonnet 4.5 | If deployed on AWS |
| GenaiCore.Llm.Embedding.Client | "cohere.embed-multilingual-v3" | AWS Bedrock | Cohere Embed Multilingual v3 | If deployed on AWS |
| GenaiCore.Llm.Completion.Client | "claude-sonnet-4.5" | GCP Vertex AI | Claude Sonnet 4.5 | If deployed on GCP |
| GenaiCore.Llm.Embedding.Client | "gemini-embedding-001" | GCP Vertex AI | Gemini Embedding 001 | If deployed on GCP |
| GenaiCore.Llm.Completion.Client | "claude-sonnet-4.5" | Azure OpenAI | Claude Sonnet 4.5 | If deployed on Azure |
| GenaiCore.Llm.Embedding.Client | "embed-v4" | Azure OpenAI | Cohere Embed v4 | If deployed on Azure |
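To summarize the per-cloud rows above, the deployment environment determines which default completion and embedding clients apply. The sketch below restates that mapping as plain data; the client names come from the table, but the lookup helper itself is illustrative, not a C3 platform API.

```python
# Illustrative only: client names are taken from the table above; the
# lookup helper is a hypothetical convenience, not part of the platform.
DEFAULT_CLIENTS = {
    "aws": {
        "completion": "claude-sonnet-4.5",            # AWS Bedrock
        "embedding": "cohere.embed-multilingual-v3",  # AWS Bedrock
    },
    "gcp": {
        "completion": "claude-sonnet-4.5",    # GCP Vertex AI
        "embedding": "gemini-embedding-001",  # GCP Vertex AI
    },
    "azure": {
        "completion": "claude-sonnet-4.5",  # Azure OpenAI
        "embedding": "embed-v4",            # Azure OpenAI
    },
}

def default_client(cloud: str, kind: str) -> str:
    """Return the default client name for a cloud ('aws', 'gcp', 'azure')
    and client kind ('completion' or 'embedding')."""
    return DEFAULT_CLIENTS[cloud][kind]

print(default_client("gcp", "embedding"))  # gemini-embedding-001
```

Note that the default completion model is the same (Claude Sonnet 4.5) on all three clouds; only the provider backing it and the embedding model differ.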

By default, models are provisioned and invoked in the region where your account is deployed.

If a model is not available in the region where your cluster or account is deployed, the corresponding GenaiCore.Llm.Auth and GenaiCore.Llm.Completion.Client objects are not configured.
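The availability rule can be sketched as a simple guard: when a model is not offered in the deployment region, no default client exists and callers must configure their own. The region set below is a placeholder for illustration, not real model-availability data, and the helper is hypothetical.

```python
from typing import Optional

# Placeholder availability data, for illustration only; consult your
# cloud provider's documentation for actual per-region model availability.
MODEL_REGIONS = {
    "claude-sonnet-4.5": {"us-east-1", "us-west-2"},
}

def default_completion_client(model: str, region: str) -> Optional[str]:
    """Return the default client name if the model is available in the
    given region; otherwise return None, mirroring the rule that no
    default GenaiCore.Llm.Completion.Client is configured."""
    if region in MODEL_REGIONS.get(model, set()):
        # Per-cloud default clients in the table are named after the model.
        return model
    return None  # no default client object is configured

print(default_completion_client("claude-sonnet-4.5", "eu-north-1"))  # None
```

When the helper returns None, the analogue on the platform is to follow the steps in Set Up LLM Clients and configure a client against a region where the model is available.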

Configuring your own LLMs

To configure your own LLM clients, follow Set Up LLM Clients.
