Add a Model via Easy Integration
Connect models from OpenAI, Anthropic, Google, Cohere, or Amazon Bedrock using guided setup.
Before You Begin
- Obtain an API key from your provider
- For Amazon Bedrock: Set up IAM role with appropriate permissions (see Configuring Amazon Bedrock)
Add a Model (Standard Providers)
Use this method for OpenAI, Anthropic, Google, or Cohere.
- Go to Models → External Models → Add a model
- Select Easy Integration and click Next
- Choose your provider and click Next
- Select a model from the supported list
- Enter a Connection name and your API key
- Click Confirm
After Integration
- Enable/Disable: Use the Inference toggle to control model availability
- Edit: Click the three-dot menu to update credentials
- Delete: Remove the model via the three-dot menu
Add Amazon Bedrock Models
Amazon Bedrock requires IAM role-based authentication instead of API keys.
Prerequisites
Create an IAM role in your AWS account that:
- Grants permission to invoke Amazon Bedrock models
- Includes a trust policy allowing Agent Platform to assume the role
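The trust policy follows the standard IAM shape; the principal value below is a placeholder, so use the Trusted Principal ARN shown to you in the Configure Credentials step:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": { "AWS": "<trusted-principal-arn-from-platform>" },
      "Action": "sts:AssumeRole"
    }
  ]
}
```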
Steps
- Go to Models → External Models → Add a model
- Select Easy Integration → AWS Bedrock → Next
- Configure Credentials:

| Field | Description |
|---|---|
| IAM Role ARN | Full ARN of your IAM role with Bedrock permissions |
| Trusted Principal ARN | Pre-populated; Platform's AWS principal (read-only) |

- Configure Model Details:

| Field | Description |
|---|---|
| Model Name | Internal name for identification |
| Model ID | Bedrock Model ID or Endpoint ID |
| Region | AWS region where the model is deployed |
| Headers | Optional custom headers for requests |
- Configure Model Settings (choose one option):
Option A: Default (Manual Configuration)
- Define prompt variables and custom variables
- Configure request body with variable placeholders (`{{prompt}}`)
- Test the response and map JSON paths for output extraction
Option B: Existing Provider Structures
- Select a provider template (OpenAI Chat Completions or Anthropic Messages)
- Enable supported features (Tool calling, Streaming, Structured response, etc.)
- Click Confirm to save, or Save as draft to configure later
Model Configuration Options
When adding Bedrock models or using advanced configuration, you can choose between two setup modes:
Default Mode
Manually configure all API components:
| Setting | Description |
|---|---|
| Variables | Define prompt variables and custom variables for dynamic binding |
| Request Body | JSON payload with {{variable}} placeholders |
| JSON Path Mapping | Extract output, input tokens, and output tokens from response |
Example paths for an OpenAI-style response:
- Output path: `choices[0].message.content`
- Input tokens: `usage.prompt_tokens`
- Output tokens: `usage.completion_tokens`
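To make the Default-mode flow concrete, here is a sketch of how variable substitution and JSON path mapping behave, assuming an OpenAI-style response shape; the helper names and sample data are illustrative, not part of the platform:

```python
def render_body(template: str, variables: dict) -> str:
    """Substitute {{variable}} placeholders in a request body template."""
    for name, value in variables.items():
        template = template.replace("{{" + name + "}}", value)
    return template

def extract(response: dict, path: str):
    """Resolve a JSON path such as 'choices[0].message.content'."""
    result = response
    # Normalize bracket indices ("[0]") into dotted segments, then walk down.
    for part in path.replace("]", "").replace("[", ".").split("."):
        result = result[int(part)] if part.isdigit() else result[part]
    return result

# Request body with a {{prompt}} placeholder, as configured in Default mode.
body = render_body('{"messages": [{"role": "user", "content": "{{prompt}}"}]}',
                   {"prompt": "Hello"})

# A sample OpenAI-style response used to test the three JSON paths above.
sample = {
    "choices": [{"message": {"content": "Hi there!"}}],
    "usage": {"prompt_tokens": 5, "completion_tokens": 3},
}
output = extract(sample, "choices[0].message.content")
in_tok = extract(sample, "usage.prompt_tokens")
out_tok = extract(sample, "usage.completion_tokens")
```

The platform performs this substitution and extraction for you; the sketch only shows what the configured paths resolve to at runtime.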
Existing Provider Structures Mode
Automatically apply pre-defined schemas from known providers:
| Provider Template | API Reference |
|---|---|
| OpenAI (Chat Completions) | Standard OpenAI chat format |
| Anthropic (Messages) | Standard Anthropic messages format |
| Google (Gemini) | Standard Gemini format |
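As a point of reference, a template such as OpenAI (Chat Completions) corresponds to a request body of roughly this shape; the `{{prompt}}` placeholder is bound at runtime and the field values shown are illustrative:

```json
{
  "model": "<your-model-id>",
  "messages": [
    { "role": "user", "content": "{{prompt}}" }
  ]
}
```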
Supported features you can enable:
- Structured response
- Tool calling
- Parallel tool calling
- Streaming
- Data generation
- Modalities (Text-to-Text, Text-to-Image, Image-to-Text, Audio-to-Text)