Configure the language models and business rules that power AI for Work.
This guide covers adding pre-built and custom LLM integrations, setting up embedding models, and defining answering and entity rules.
General Purpose LLMs
AI for Work integrates with OpenAI, Azure OpenAI, and Google Gemini out of the box, and supports custom models via API endpoint. All configurations are managed from the Admin Console.
Supported Providers
| Provider | Available Models | Access Method | Best For |
|---|---|---|---|
| OpenAI | GPT-5-mini, GPT-5-nano, GPT-5, GPT-4.1, GPT-4.1-mini, GPT-4o, GPT-3.5-turbo, o3, o3-mini, o4-mini | OpenAI API key | General-purpose tasks, conversational AI, complex reasoning, content generation |
| Azure OpenAI | GPT-5-mini, GPT-5-nano, GPT-5, GPT-4.1, GPT-4.1-mini, GPT-4o, GPT-3.5-turbo, o3, o3-mini, o4-mini | Azure Portal / Azure OpenAI Service | Enterprise deployments requiring Azure infrastructure, compliance, and enhanced security |
| Google Gemini | Gemini 2.5 Pro (Recommended), Gemini 2.5 Flash (Recommended), Gemini 2.5 Flash Lite, Gemini 2.0 Flash, Gemini 2.0 Flash Lite | Google Vertex AI / Gemini Studio | Multi-modal tasks, fast responses, Google Cloud deployments |
Model Tiers
Select a tier based on task complexity and cost requirements.
| Tier | Best For | Example Models |
|---|---|---|
| Basic | High-volume, straightforward tasks — classification, simple Q&A, routine inquiries | GPT-4o-mini, Gemini 2.5 Flash, Gemini 2.0 Flash |
| Standard | Complex reasoning, multi-step workflows, deeper contextual understanding — recommended default for all system prompts and orchestrators | GPT-4.1, GPT-4o, Gemini 2.5 Pro |
| Premium | Advanced reasoning and complex problem-solving requiring maximum AI capability | o3, o3-mini, o4-mini |
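As an illustration, the tier guidance above can be encoded as a simple lookup that routes a task type to a default model. The task-type categories and default picks below are assumptions made for this example, not product settings.

```python
# Illustrative sketch: route a task to a model tier, per the tiers table.
# Tier names and example models come from the table; task categories are
# assumptions for the example.
TIER_DEFAULTS = {
    "basic": "Gemini 2.5 Flash",   # high-volume, straightforward tasks
    "standard": "GPT-4.1",         # recommended default for system prompts
    "premium": "o3",               # advanced reasoning
}

TASK_TO_TIER = {
    "classification": "basic",
    "simple_qa": "basic",
    "orchestrator": "standard",
    "system_prompt": "standard",
    "complex_reasoning": "premium",
}

def pick_model(task_type: str) -> str:
    """Return an example model for a task, defaulting to the Standard tier."""
    tier = TASK_TO_TIER.get(task_type, "standard")
    return TIER_DEFAULTS[tier]
```

Defaulting unknown tasks to Standard mirrors the table's recommendation to treat Standard as the baseline for system prompts and orchestrators.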
Configuration
Pre-Built LLM
- Go to Admin Console > Assist Configuration > General Purpose.
- Click New and select a provider: OpenAI, Azure OpenAI, or Google Gemini.
- Enter the required details:
  - Integration Name — A unique identifier (for example, `OpenAI-Production`).
  - API Key — Your provider API key.
  - Model Name — Select from the dropdown.
- Review and accept the Policy Guidelines, then click Save.
- Confirm the integration appears as active in the General LLM Integrations list.
Provider-specific notes
| Provider | Notes |
|---|---|
| OpenAI | Get your API key from platform.openai.com. Rotate keys regularly and monitor usage for cost control. |
| Azure OpenAI | In the Azure Portal, open your Azure OpenAI resource and go to Keys and Endpoint. Copy the endpoint URL and an API key. Confirm your subscription has the required quotas; configure VNet rules if needed. |
| Google Gemini | Enable the Vertex AI API in Google Cloud Console or use Gemini Studio. Create a service account, generate credentials, and verify that selected models are enabled in Google Model Garden. Configure project-level billing before use. |
Custom LLM
Use this option for proprietary or self-hosted models exposed via API.
- Go to Admin Console > Assist Configuration > General Purpose.
- Click New and select Custom LLM.
- Enter the basic configuration:
  - Integration Name — A descriptive name for this integration.
  - Model Name — The model identifier.
  - Endpoint URL — The full API URL where your model is hosted.
- Configure API settings:
  - Method — Select the HTTP method (typically `POST`).
  - Max Request Tokens — Set a token limit to control cost and response size.
- Set up authentication:
  - Auth Type — Choose API Key, Bearer Token, or Custom Header.
  - Enter the credentials required by your model.
- Add custom headers if required:
  - Click + Add a Header and enter key-value pairs (for example, `Content-Type: application/json`).
- Test the connection:
  - Enter a sample payload and click Test.
  - Success confirms connectivity; Error returns details to help you troubleshoot.
- Accept the Policy Guidelines and click Save.
- Confirm the integration appears as active in the General LLM Integrations list.
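To make the authentication step concrete, the sketch below shows one plausible way the three Auth Type options map onto HTTP request headers. The header name `x-api-key` for the API Key option is an assumption for this example; real vendors vary, and this is not an AI for Work default.

```python
# Hypothetical sketch: map the three Auth Type options to HTTP headers.
# The "x-api-key" default is an assumption; check your model vendor's docs.
def build_auth_headers(auth_type: str, credential: str,
                       header_name: str = "x-api-key") -> dict:
    if auth_type == "API Key":
        return {header_name: credential}
    if auth_type == "Bearer Token":
        return {"Authorization": f"Bearer {credential}"}
    if auth_type == "Custom Header":
        # For Custom Header, the caller supplies the header name explicitly.
        return {header_name: credential}
    raise ValueError(f"Unsupported auth type: {auth_type}")

# These merge with any custom headers you add in the console, e.g.
# {"Content-Type": "application/json"}.
```

Combining the result with your custom headers (such as `Content-Type: application/json`) yields the full header set the Test action would send to your endpoint.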
Embedding Models
Embedding models convert text into vector representations, enabling semantic search, similarity matching, and other AI-powered features. Unlike keyword search, which matches exact words, embeddings capture meaning and context, so semantically related content is found even when the wording differs.
Both pre-built and custom embedding models are supported.
Embedding models are required for the attachments feature, which lets users attach files and ask questions about their content.
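The core idea behind semantic search is that related texts produce nearby vectors. A minimal sketch, using toy 3-dimensional vectors (real embedding models return hundreds or thousands of dimensions):

```python
import math

def cosine_similarity(a: list, b: list) -> float:
    """Cosine similarity between two embedding vectors (1.0 = same direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Toy vectors standing in for real embeddings of a query and two documents.
query = [0.1, 0.9, 0.2]
doc_similar = [0.12, 0.85, 0.25]   # close in meaning to the query
doc_unrelated = [0.9, 0.05, 0.1]   # different topic
```

Ranking documents by `cosine_similarity(query, doc)` is what lets the attachments feature answer questions about file content: chunks of the file whose embeddings sit closest to the query embedding are retrieved as context.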
Managing Integrations
View active integrations
All configured models appear in the General LLM Integrations list.
| Field | Description |
|---|---|
| Integration Name | Identifier for the configuration |
| Provider | OpenAI, Azure OpenAI, Gemini, or Custom |
| Model Name | Specific model version in use |
| Status | Active, inactive, or error |
| Last Updated | Timestamp of the most recent change |
Modify an integration
- Locate the integration in the General LLM Integrations list.
- Click the edit icon or the integration name.
- Update the required fields — API key, model version, headers, and so on.
- Test the connection, then click Save.
Remove an integration
- Locate the integration in the General LLM Integrations list.
- Click Delete or Deactivate.
- Confirm when prompted.
- Reconfigure any dependent features to use an alternative model.
Business Rules
Business rules control how AI for Work selects entities and generates responses when specific keywords appear in user input. Use them to ensure answers align with organizational policies and goals.
Access business rules at Admin Console > Business Rules. Rules from all published custom integrations in the account appear here.
Two rule types are available:
Answering Rules
Answering rules define the response the system returns when a user query contains specific keywords. When a keyword is detected, the system displays the answer you configured instead of generating one dynamically.
| Field | Description |
|---|---|
| Connection | Toggle on to make the rule connection-specific — it triggers only when both the bot connection and keyword match. Toggle off for keyword-only matching, independent of any connection. |
| Rule | Enter the question and specify the answer the system should display. |
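The matching behavior described above can be sketched as follows. This is an illustrative model only: it assumes case-insensitive keyword containment and first-match-wins ordering, neither of which is documented product behavior, and the field names are invented for the example.

```python
# Illustrative sketch of answering-rule matching. Assumes case-insensitive
# keyword containment and first-match-wins; field names are hypothetical.
def match_answering_rule(query, rules, connection=None):
    """Return the configured answer for the first matching rule, else None."""
    q = query.lower()
    for rule in rules:
        if rule["keyword"].lower() not in q:
            continue
        # Connection-specific rules also require the bot connection to match;
        # rules without a connection match on keyword alone.
        if rule.get("connection") and rule["connection"] != connection:
            continue
        return rule["answer"]
    return None
```

When a rule matches, the configured answer is returned verbatim in place of a dynamically generated response.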
Entity Rules
Entity rules prefill entities in a query when the system detects specific keywords within a matching connection intent. This automates data entry and streamlines interactions that depend on predefined values.
| Field | Description |
|---|---|
| Connection | Select a custom connection from the available list. |
| Rule | Write the rule in detail, then click Build Flow to proceed. |
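Entity prefilling can be pictured as follows. Again a hedged sketch: the rule structure, field names, and case-insensitive keyword matching are assumptions for illustration, not the product's internal representation.

```python
# Illustrative sketch of entity-rule prefilling: when a keyword appears in a
# query for a matching connection, the rule's entity values are filled in
# before the flow runs. Field names are hypothetical.
def prefill_entities(query, connection, rules):
    """Collect prefilled entity values from every matching rule."""
    entities = {}
    q = query.lower()
    for rule in rules:
        if rule["connection"] == connection and rule["keyword"].lower() in q:
            entities.update(rule["entities"])
    return entities
```

The prefilled values then flow into the connection intent, so users are not prompted for data the rule already supplies.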