Connect commercial and custom models to Agent Platform.

Overview

External models are AI models hosted outside the platform. Once connected, they can be used across Agent Platform in Agentic Apps, Prompt Studio, Tools, and Evaluation Studio.

Supported Providers (Easy Integration):
| Provider | Authentication |
|---|---|
| OpenAI | API Key |
| Anthropic | API Key |
| Google | API Key |
| Cohere | API Key |
| Azure OpenAI | API Key + Endpoint |
| Amazon Bedrock | IAM Role ARN |
Custom Models (API Integration): Connect any model via a REST API endpoint. For the complete list of supported models, see Supported Models.

Manage Connected Models

View Models

  • Go to Models → External Models to see all connected models.

Manage Connections

Each model can have multiple connections with different API keys, enabling separate usage tracking and billing.
| Action | Description |
|---|---|
| Inference Toggle | Enable or disable the model's availability across the platform |
| Edit | Update the API key or credentials |
| Delete | Remove the connection |
When adding multiple API keys for the same model, each connection must have a unique name and API key. In Agentic Apps, you can assign specific connections at the Agent or Supervisor level.

Add a Model via Easy Integration

Use Easy Integration for commercial providers with API keys or IAM roles.

Standard Providers (OpenAI, Anthropic, Google, Cohere)

  1. Go to Models → External Models → Add a model
  2. Select Easy Integration → click Next
  3. Choose your provider → click Next
  4. Select a model from the supported list
  5. Enter a Connection name and your API key
  6. Click Confirm
The model is now available across Agent Platform.
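
If a newly added connection misbehaves, it can save time to confirm the API key works outside the platform first. A minimal sketch against OpenAI's public list-models endpoint (the URL and header scheme are OpenAI's own, not Agent Platform's; other providers have equivalent endpoints):

import os
import requests

# Key you intend to paste into the Connection form (assumed to be in the env).
api_key = os.environ["OPENAI_API_KEY"]

# Listing models is a cheap way to validate a key before using it.
resp = requests.get(
    "https://api.openai.com/v1/models",
    headers={"Authorization": f"Bearer {api_key}"},
    timeout=10,
)
resp.raise_for_status()  # a 401 here means the key is invalid or revoked
print(f"Key OK; {len(resp.json()['data'])} models visible")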

Amazon Bedrock

Bedrock uses IAM role-based authentication instead of API keys.

Prerequisites: Create an IAM role in AWS with Bedrock permissions and a trust policy that allows Agent Platform to assume the role; see Configuring Amazon Bedrock for IAM setup. A sketch of this setup appears below.
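
As a hedged sketch of that prerequisite: the trust policy must name the platform's AWS principal (the value pre-populated as Trusted Principal ARN in the connection form) so it can assume your role. The account ID, role name, and principal below are placeholders, and the AWS-managed AmazonBedrockFullAccess policy is broader than most deployments need:

import json
import boto3

iam = boto3.client("iam")

# Placeholder: copy the real value from the pre-populated
# Trusted Principal ARN field in the connection form.
TRUSTED_PRINCIPAL = "arn:aws:iam::111122223333:role/agent-platform"

trust_policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Principal": {"AWS": TRUSTED_PRINCIPAL},
        "Action": "sts:AssumeRole",
    }],
}

role = iam.create_role(
    RoleName="agent-platform-bedrock",  # placeholder role name
    AssumeRolePolicyDocument=json.dumps(trust_policy),
)

# Grants Bedrock access; scope this down for production use.
iam.attach_role_policy(
    RoleName="agent-platform-bedrock",
    PolicyArn="arn:aws:iam::aws:policy/AmazonBedrockFullAccess",
)

print(role["Role"]["Arn"])  # the value to paste into IAM Role ARN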
Steps:
  1. Go to Models → External Models → Add a model
  2. Select Easy Integration → AWS Bedrock → click Next
  3. Configure credentials and model details:
| Field | Description |
|---|---|
| IAM Role ARN | Your IAM role with Bedrock permissions |
| Trusted Principal ARN | Platform's AWS principal (pre-populated) |
| Model Name | Internal identifier |
| Model ID | Bedrock Model ID or Endpoint ID |
| Region | AWS region of the model |
| Headers | Optional custom headers |
  4. Configure model settings using Default or Existing Provider Structures
  5. Click Confirm
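
Before saving, you can sanity-check that the role can reach the model with the same values you enter in the form. A sketch using boto3; the role ARN, region, and model ID are placeholders, and the request body follows the Anthropic-on-Bedrock messages format:

import json
import boto3

# Assume the role the platform will use (placeholder ARN).
creds = boto3.client("sts").assume_role(
    RoleArn="arn:aws:iam::111122223333:role/agent-platform-bedrock",
    RoleSessionName="connection-test",
)["Credentials"]

bedrock = boto3.client(
    "bedrock-runtime",
    region_name="us-east-1",  # must match the Region field
    aws_access_key_id=creds["AccessKeyId"],
    aws_secret_access_key=creds["SecretAccessKey"],
    aws_session_token=creds["SessionToken"],
)

# Use the same value you enter as Model ID.
resp = bedrock.invoke_model(
    modelId="anthropic.claude-3-5-sonnet-20240620-v1:0",
    body=json.dumps({
        "anthropic_version": "bedrock-2023-05-31",
        "max_tokens": 64,
        "messages": [{"role": "user", "content": "ping"}],
    }),
)
print(json.loads(resp["body"].read())["content"][0]["text"])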

Add a Model via API Integration

Use API Integration for custom endpoints or self-hosted models.
Note: For Agentic Apps compatibility, custom models must support tool calling and follow OpenAI or Anthropic request/response structures.
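
To make that note concrete, here is a sketch of the tool-calling round trip a custom endpoint would need to handle, following OpenAI's Chat Completions structures (the model name and tool are invented for illustration):

# Shape of a request the platform would send when a tool is available:
request_body = {
    "model": "my-custom-model",  # placeholder
    "messages": [{"role": "user", "content": "What's the weather in Paris?"}],
    "tools": [{
        "type": "function",
        "function": {
            "name": "get_weather",
            "description": "Look up current weather for a city",
            "parameters": {
                "type": "object",
                "properties": {"city": {"type": "string"}},
                "required": ["city"],
            },
        },
    }],
}

# Shape of a response the endpoint must return to trigger that tool:
expected_response = {
    "choices": [{
        "message": {
            "role": "assistant",
            "content": None,
            "tool_calls": [{
                "id": "call_1",
                "type": "function",
                "function": {"name": "get_weather",
                             "arguments": '{"city": "Paris"}'},
            }],
        },
        "finish_reason": "tool_calls",
    }],
    "usage": {"prompt_tokens": 42, "completion_tokens": 18},
}

print(expected_response["choices"][0]["message"]["tool_calls"][0]["function"]["name"])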

Steps

  1. Go to Models → External Models → Add a model
  2. Select Custom Integration → click Next
  3. Enter basic configuration:
| Field | Description |
|---|---|
| Connection Name | Unique identifier |
| Model Endpoint URL | Full API endpoint URL |
| Authorization Profile | Select a configured auth profile or None |
| Headers | Optional key-value pairs sent with requests |
  4. Configure model settings using Default or Existing Provider Structures
  5. Click Confirm
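
A quick smoke test of the endpoint with the same URL and headers you entered can rule out connectivity and auth problems before you rely on the in-product Test button. A sketch; the URL, token, and model name are placeholders:

import requests

resp = requests.post(
    "https://models.example.com/v1/chat/completions",  # Model Endpoint URL
    headers={
        "Authorization": "Bearer YOUR_TOKEN",  # or your auth profile's scheme
        "Content-Type": "application/json",
    },
    json={
        "model": "my-custom-model",
        "messages": [{"role": "user", "content": "Say hello"}],
        "max_tokens": 32,
    },
    timeout=30,
)
resp.raise_for_status()  # 4xx here usually means a URL or auth problem
print(resp.json())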

Model Configuration Modes

When using API Integration or advanced Bedrock setup, choose one of these configuration modes:

Default Mode

Manually configure request/response handling for complete control.
1. Define Variables
| Variable Type | Description |
|---|---|
| Prompt | Primary input text (required) |
| System Prompt | System instructions (optional) |
| Examples | Few-shot examples (optional) |
| Custom Variables | Additional dynamic inputs with a name, display name, and data type |
2. Configure Request Body
Create the JSON payload using {{variable}} placeholders:
{
  "model": "your-model-name",
  "messages": [
    {"role": "system", "content": "{{system.prompt}}"},
    {"role": "user", "content": "{{prompt}}"}
  ],
  "max_tokens": 1000,
  "temperature": 0.7
}
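
For illustration, the substitution the platform performs on a payload like the one above can be approximated as below. This sketches the mechanic only (the render helper is hypothetical, not the platform's implementation), with invented variable values:

import json
import re

template = '''{
  "model": "your-model-name",
  "messages": [
    {"role": "system", "content": "{{system.prompt}}"},
    {"role": "user", "content": "{{prompt}}"}
  ],
  "max_tokens": 1000,
  "temperature": 0.7
}'''

variables = {
    "system.prompt": "You are a concise assistant.",
    "prompt": "Summarize Hamlet in one sentence.",
}

def render(tmpl: str, values: dict) -> str:
    """Replace each {{name}} with a JSON-escaped value so the payload stays valid."""
    def sub(match):
        return json.dumps(values[match.group(1).strip()])[1:-1]
    return re.sub(r"\{\{(.*?)\}\}", sub, tmpl)

payload = json.loads(render(template, variables))
print(payload["messages"][1]["content"])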
3. Map Response JSON Paths
Click Test to send a sample request, then configure the extraction paths:
| Field | Description | Example |
|---|---|---|
| Output Path | Location of the generated text | choices[0].message.content |
| Input Tokens | Input token count | usage.prompt_tokens |
| Output Tokens | Output token count | usage.completion_tokens |
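
For illustration, here is roughly what those extraction paths do against an OpenAI-style response; the path-walking helper below is a hypothetical approximation of the platform's parser:

sample_response = {
    "choices": [{"message": {"role": "assistant", "content": "Hello!"}}],
    "usage": {"prompt_tokens": 12, "completion_tokens": 3},
}

def extract(data, path: str):
    """Walk a dotted/indexed path like 'choices[0].message.content'."""
    for part in path.replace("]", "").replace("[", ".").split("."):
        data = data[int(part)] if part.isdigit() else data[part]
    return data

print(extract(sample_response, "choices[0].message.content"))  # Hello!
print(extract(sample_response, "usage.prompt_tokens"))         # 12
print(extract(sample_response, "usage.completion_tokens"))     # 3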

Existing Provider Structures Mode

Automatically apply pre-defined schemas from known providers. Recommended when your model follows a standard API format.
1. Select Provider Template
| Template | Use When |
|---|---|
| OpenAI (Chat Completions) | Model follows the OpenAI chat API format |
| Anthropic (Messages) | Model follows the Anthropic messages API format |
| Google (Gemini) | Model follows the Gemini API format |
2. Enter Model Name
Specify the model identifier used in request bodies.
3. Enable Model Features
Enable only the features your model supports:
| Feature | Description |
|---|---|
| Structured Response | JSON-formatted outputs for Prompts and Tools |
| Tool Calling | Function calling for Agentic Apps and AI nodes |
| Parallel Tool Calling | Multiple tool calls per request |
| Streaming | Real-time token generation for Agentic Apps |
| Data Generation | Synthetic data generation in Prompt Studio |
| Modalities | Text-to-Text, Text-to-Image, Image-to-Text, Audio-to-Text |
Warning: Enabling unsupported features may cause unexpected behavior.
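
One way to avoid that is to probe a feature directly before enabling it. For example, OpenAI-style streaming arrives as Server-Sent Events; if the request below does not produce data: chunks, leave Streaming off (endpoint, token, and model name are placeholders):

import requests

with requests.post(
    "https://models.example.com/v1/chat/completions",  # placeholder
    headers={"Authorization": "Bearer YOUR_TOKEN"},
    json={
        "model": "my-custom-model",
        "messages": [{"role": "user", "content": "Count to five"}],
        "stream": True,  # the feature under test
    },
    stream=True,
    timeout=30,
) as resp:
    resp.raise_for_status()
    for line in resp.iter_lines():
        if line:
            print(line.decode())  # OpenAI-style streams emit 'data: {...}' lines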

Troubleshooting

| Issue | Solution |
|---|---|
| Test fails | Verify the endpoint URL and authentication |
| Empty response | Check that the JSON path mapping matches the response structure |
| Model not in dropdowns | Ensure the Inference toggle is ON |
| Tool calling not working | Verify the model supports it and the feature is enabled |
| Bedrock connection fails | Check the IAM role ARN and trust policy configuration |