Overview
External models are AI models hosted outside the platform. Once connected, they can be used across Agent Platform in Agentic Apps, Prompt Studio, Tools, and Evaluation Studio.
Supported Providers (Easy Integration):
| Provider | Authentication | Tool Calling |
|---|---|---|
| OpenAI | API Key | ✓ |
| Anthropic | API Key | ✓ |
| Google | API Key | ✓ |
| Cohere | API Key | ✓ |
| Azure OpenAI | API Key + Endpoint | ✓ |
| Amazon Bedrock | IAM Role ARN | ✓ |
Manage Connected Models
View Models
- Go to Models → External Models to see all connected models.
Manage Connections
Each model can have multiple connections with different API keys, enabling separate usage tracking and billing.
| Action | Description |
|---|---|
| Inference Toggle | Enable/disable model availability across platform |
| Edit | Update API key or credentials |
| Delete | Remove the connection |
Add a Model via Easy Integration
Use Easy Integration for commercial providers with API keys or IAM roles.
Standard Providers (OpenAI, Anthropic, Google, Cohere)
- Go to Models → External Models → Add a model
- Select Easy Integration → click Next
- Choose your provider → click Next
- Select a model from the supported list
- Enter a Connection name and your API key
- Click Confirm
Amazon Bedrock
Bedrock uses IAM role-based authentication instead of API keys.
Prerequisites: Create an IAM role in AWS with Bedrock permissions and a trust policy allowing Agent Platform to assume the role. See Configuring Amazon Bedrock for IAM setup.
Steps:
- Go to Models → External Models → Add a model
- Select Easy Integration → AWS Bedrock → Next
- Configure credentials and model details:
| Field | Description |
|---|---|
| IAM Role ARN | Your IAM role with Bedrock permissions |
| Trusted Principal ARN | Platform’s AWS principal (pre-populated) |
| Model Name | Internal identifier |
| Model ID | Bedrock Model ID or Endpoint ID |
| Region | AWS region of the model |
| Headers | Optional custom headers |
- Configure model settings using Default or Existing Provider Structures
- Click Confirm
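The trust policy mentioned in the prerequisites follows the standard AWS assume-role format; a sketch is shown below, with the platform's principal (pre-populated as Trusted Principal ARN in the form) left as a placeholder:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": { "AWS": "<Trusted Principal ARN from the form>" },
      "Action": "sts:AssumeRole"
    }
  ]
}
```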
Add a Model via API Integration
Use API Integration for custom endpoints or self-hosted models.
Note: For Agentic Apps compatibility, custom models must support tool calling and follow OpenAI or Anthropic request/response structures.
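To make the compatibility note concrete, here is a minimal sketch (Python dicts with illustrative values, not a platform API) of the OpenAI-style chat request and response shapes such a model would need to accept and return:

```python
# Illustrative OpenAI-style chat request a compatible custom model must accept.
request = {
    "model": "my-custom-model",  # hypothetical model name
    "messages": [
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Hello"},
    ],
}

# ...and the matching response shape it must return.
response = {
    "choices": [{"message": {"role": "assistant", "content": "Hi there!"}}],
    "usage": {"prompt_tokens": 14, "completion_tokens": 4},
}

# The platform reads the generated text and token counts from these paths.
print(response["choices"][0]["message"]["content"])  # Hi there!
print(response["usage"]["prompt_tokens"])            # 14
```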
Steps
- Go to Models → External Models → Add a model
- Select Custom Integration → click Next
- Enter basic configuration:
| Field | Description |
|---|---|
| Connection Name | Unique identifier |
| Model Endpoint URL | Full API endpoint URL |
| Authorization Profile | Select configured auth profile or None |
| Headers | Optional key-value pairs for requests |
- Configure model settings using Default or Existing Provider Structures
- Click Confirm
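Before adding a custom endpoint, it can help to verify the URL, credentials, and headers locally. A Python sketch that builds (but deliberately does not send) such a request; the URL, key, and header values are placeholders:

```python
import json
import urllib.request

# Placeholder endpoint and credentials; substitute your own values.
url = "https://models.example.com/v1/chat/completions"
headers = {
    "Authorization": "Bearer <API_KEY>",  # from your Authorization Profile
    "Content-Type": "application/json",
    "X-Custom-Header": "example",         # optional key-value header
}
body = {"model": "my-custom-model", "messages": [{"role": "user", "content": "ping"}]}

req = urllib.request.Request(url, data=json.dumps(body).encode("utf-8"), headers=headers)
print(req.get_full_url())
print(req.get_header("Content-type"))  # urllib stores header names capitalized
# urllib.request.urlopen(req) would send it; left out here to avoid a live call.
```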
Model Configuration Modes
When using API Integration or advanced Bedrock setup, choose one of these configuration modes:
Default Mode
Manually configure request/response handling for complete control.
1. Define Variables
| Variable Type | Description |
|---|---|
| Prompt | Primary input text (required) |
| System Prompt | System instructions (optional) |
| Examples | Few-shot examples (optional) |
| Custom Variables | Additional dynamic inputs with name, display name, and data type |
Reference defined variables in the request body as {{variable}} placeholders.
2. Map Response Fields:
| Field | Description | Example |
|---|---|---|
| Output Path | Location of generated text | choices[0].message.content |
| Input Tokens | Input token count | usage.prompt_tokens |
| Output Tokens | Output token count | usage.completion_tokens |
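As an illustration of how path mappings such as choices[0].message.content locate data in the raw response, here is a small Python sketch; the resolve_path helper is hypothetical, not a platform API:

```python
import re

def resolve_path(obj, path):
    """Walk a dotted/bracketed JSON path such as 'choices[0].message.content'."""
    for part in re.findall(r"[^.\[\]]+", path):
        # Numeric segments index into lists; others index into dicts.
        obj = obj[int(part)] if part.isdigit() else obj[part]
    return obj

# Sample OpenAI-style response body.
response = {
    "choices": [{"message": {"content": "Paris is the capital of France."}}],
    "usage": {"prompt_tokens": 12, "completion_tokens": 8},
}

print(resolve_path(response, "choices[0].message.content"))
print(resolve_path(response, "usage.prompt_tokens"))      # 12
print(resolve_path(response, "usage.completion_tokens"))  # 8
```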
Existing Provider Structures Mode
Automatically apply pre-defined schemas from known providers. Recommended when your model follows a standard API format.
1. Select Provider Template
| Template | Use When |
|---|---|
| OpenAI (Chat Completions) | Model follows OpenAI chat API format |
| Anthropic (Messages) | Model follows Anthropic messages API format |
| Google (Gemini) | Model follows Gemini API format |
2. Enable Supported Features
| Feature | Description |
|---|---|
| Structured Response | JSON-formatted outputs for Prompts and Tools |
| Tool Calling | Function calling for Agentic Apps and AI nodes |
| Parallel Tool Calling | Multiple tool calls per request |
| Streaming | Real-time token generation for Agentic Apps |
| Data Generation | Synthetic data generation in Prompt Studio |
| Modalities | Text-to-Text, Text-to-Image, Image-to-Text, Audio-to-Text |
Warning: Enabling unsupported features may cause unexpected behavior.
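For reference, the Tool Calling feature assumes the model understands an OpenAI-style tools array in the request and can answer with tool_calls. A minimal sketch of those two shapes (the get_weather function and its arguments are illustrative):

```python
import json

# Request fragment: tools the model is allowed to call.
tools = [{
    "type": "function",
    "function": {
        "name": "get_weather",  # hypothetical tool
        "description": "Look up current weather for a city",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    },
}]

# Response fragment: the model asks the caller to invoke the tool.
message = {
    "role": "assistant",
    "tool_calls": [{
        "id": "call_1",
        "type": "function",
        "function": {"name": "get_weather", "arguments": "{\"city\": \"Paris\"}"},
    }],
}

# Arguments arrive as a JSON string and must be parsed before use.
args = json.loads(message["tool_calls"][0]["function"]["arguments"])
print(args["city"])  # Paris
```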
Troubleshooting
| Issue | Solution |
|---|---|
| Test fails | Verify endpoint URL and authentication |
| Empty response | Check JSON path mapping matches response structure |
| Model not in dropdowns | Ensure Inference toggle is ON |
| Tool calling not working | Verify model supports it and feature is enabled |
| Bedrock connection fails | Check IAM role ARN and trust policy configuration |
Related
- Supported Models — Complete list of supported models
- Configuring Amazon Bedrock — IAM role setup for AWS
- Authorization Profiles — Configure authentication