Generative AI Tools lets you integrate large language models (LLMs) into AI for Service apps. You can use pre-built integrations, bring your own model, or deploy Kore.ai’s fine-tuned XO GPT models—and apply them across Automation AI, Search AI, Agent AI, Contact Center AI, and Quality AI.
Overview
To access Generative AI Tools, go to Product Switcher > Generative AI Tools, or click the Generative AI Tools icon in the left-side menu.
Integration Options
| Option | Description |
|---|---|
| Pre-built Integrations | Connect to OpenAI, Azure OpenAI, or Anthropic using pre-configured prompt templates. Newly launched models are supported once the required authentication is in place. You can also create custom prompts within these integrations. |
| Bring Your Own Model (BYO) | Integrate externally hosted or self-hosted enterprise models. Works with the Platform’s Auth Profiles module so you can use your preferred authentication mechanism. |
| Kore.ai XO GPT | Fine-tuned LLMs built for enterprise conversational AI—evaluated for accuracy, safety, and production readiness. See Kore.ai XO GPT. |
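For a BYO integration, the externally hosted model typically exposes an HTTP endpoint that the Platform calls using the authentication mechanism you configure in Auth Profiles. As a sketch only (the endpoint path, payload shape, and model name below are hypothetical, following the common OpenAI-compatible convention; your model's API may differ):

```python
import json

def build_chat_request(base_url, api_key, model, user_message):
    """Build an OpenAI-compatible chat completion request for a
    self-hosted model. Bearer auth is shown as one possible Auth
    Profile choice; the request is constructed but not sent."""
    return {
        "url": f"{base_url}/v1/chat/completions",
        "headers": {
            "Authorization": f"Bearer {api_key}",  # hypothetical bearer-token auth
            "Content-Type": "application/json",
        },
        "body": json.dumps({
            "model": model,
            "messages": [{"role": "user", "content": user_message}],
        }),
    }

# Placeholder host, key, and model name for illustration only.
req = build_chat_request(
    "https://llm.example.internal", "sk-placeholder",
    "my-enterprise-model", "Hello",
)
print(req["url"])
```

Whatever the exact shape, the Platform supplies the prompt and credentials at call time, so the same Auth Profile can back multiple GenAI features.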
Key Features
| Feature | Description |
|---|---|
| Models Library | Connect to any LLM via pre-built integrations, custom integrations, or XO GPT. |
| Prompts Library | Create fully customized prompts optimized for your use case and model. |
| Automation AI | Auto dialog generation, utterance suggestions, conversation summaries, and more. See Automation AI Features. |
| Search AI | AI-powered enterprise search. See Search AI Features. |
| Agent AI | GenAI features for agent-facing workflows. See Agent AI Features. |
| Contact Center AI | GenAI features for contact center operations. See Contact Center AI Features. |
| Quality AI | AI-driven quality monitoring and analysis. See Quality AI Features. |
| Data Anonymization | Mask PII and sensitive data before it reaches the LLM. See Data Anonymization. |
| Guardrails | Enforce safety and appropriateness of LLM responses. See Guardrails. |
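To illustrate what data anonymization does conceptually, here is a minimal masking pass that replaces a couple of PII patterns before text would reach an LLM. This is an illustrative sketch, not the platform's implementation; the Data Anonymization feature covers far more entity types and contexts:

```python
import re

def mask_pii(text):
    """Illustrative PII masking: replace email addresses and simple
    US-style phone numbers with placeholder tokens. A production
    anonymizer handles many more entity types (names, IDs, addresses)."""
    text = re.sub(r"[\w.+-]+@[\w-]+\.[\w.]+", "[EMAIL]", text)
    text = re.sub(r"\b\d{3}[-.\s]?\d{3}[-.\s]?\d{4}\b", "[PHONE]", text)
    return text

masked = mask_pii("Reach me at jane.doe@example.com or 555-123-4567.")
print(masked)  # → Reach me at [EMAIL] or [PHONE].
```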
Key Benefits
| Benefit | Description |
|---|---|
| Flexible LLM choice | Use XO GPT, a pre-built integration, or your own custom model—no lock-in. |
| Faster development | Auto-generate dialog tasks and training utterances; create dialog flows on the fly. |
| Custom GenAI use cases | Write your own prompts to build capabilities beyond what pre-built integrations expose. |
| Productivity automation | Automate repetitive tasks (dialog generation, utterance suggestions) so you focus on conversation design and testing. |
| Smarter testing | Get AI suggestions for test cases, probable user inputs, and error scenarios directly in Conversation Testing. See Conversation Testing. |
| Guided onboarding | The platform offers curated use-case suggestions when creating an AI Agent. |
Free Tokens
New apps in Standard workspaces receive free LLM tokens powered by OpenAI GPT-4o (Azure) via Kore.ai’s enterprise account. This lets you explore GenAI features immediately—no LLM setup required.
Free tokens are only available for apps created in Standard workspaces.
Claim free tokens
Select the free tokens icon in the platform header. Once claimed, all GenAI features activate using OpenAI GPT-4o (Azure).
Automation AI features available with free tokens
| Feature | Description |
|---|---|
| Automatic Dialog Generation | Generate complete dialog flows from minimal input. |
| Conversation Test Case Suggestions | Get AI-suggested test cases for robust conversations. |
| Conversation Summary | Auto-generate conversation summaries for quick review. |
| NLP Batch Test Case Suggestions | Bulk-generate NLP test case suggestions. |
| Training Utterance Suggestions | Get smart suggestions to improve NLP training. |
| Similar Utterance Suggestions | Broaden coverage with variations on existing utterances. |
| Opposite Utterance Suggestions | Cover contrasting scenarios with opposite-intent utterances. |
| Answer Generation | Generate accurate, contextually appropriate answers. |
| GenAI Node | Implement complex AI-driven logic in dialog flows. |
| GenAI Prompt | Craft and refine LLM prompts within dialogs. |
| Repeat Responses | Ensure consistent answers for repeated queries. |
| Rephrase Dialog Responses | Add natural variation to dialog responses automatically. |
See all GenAI features.
Security note
Do not upload sensitive documents while using free tokens. Free tokens are for exploration and testing only—switch to your own LLM account before handling confidential data.
Transition to your own LLM
When your free tokens run out:
- Go to Generative AI Tools > Models Library.
- Select a preferred LLM provider.
- Obtain an API key from that provider.
- Enter the API key and save.
- Go to GenAI Features and enable the features you need.
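Before pasting a provider key into the Models Library, it can help to confirm the key is valid. As a sketch for OpenAI (the request is built but not sent here; sending a GET to this endpoint returns HTTP 200 for a valid key and 401 for an invalid one):

```python
import urllib.request

def build_key_check(api_key):
    """Build (but do not send) a request to OpenAI's models endpoint,
    a common way to sanity-check an API key before saving it."""
    return urllib.request.Request(
        "https://api.openai.com/v1/models",
        headers={"Authorization": f"Bearer {api_key}"},
    )

req = build_key_check("sk-your-key-here")  # placeholder key
print(req.full_url)
```

Other providers (Azure OpenAI, Anthropic) use different endpoints and auth headers, so check each provider's API reference for the equivalent call.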
Getting Started
- Integrate a pre-built, custom LLM, or XO GPT model in Models Library.
- Create prompts in the Prompts Library.
- Enable GenAI Features.
- (Optional) Enable Data Anonymization and Guardrails.
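As a sketch of what a custom prompt created in step 2 might look like, the example below combines instruction text with runtime variables. The variable syntax, names, and values here are hypothetical; follow the Prompts Library's own placeholder conventions:

```python
# Hypothetical prompt template with runtime variables; the Prompts
# Library's actual variable syntax may differ.
TEMPLATE = (
    "You are a support assistant for {brand}.\n"
    "Summarize the conversation below in {max_sentences} sentences:\n"
    "{conversation}"
)

prompt = TEMPLATE.format(
    brand="Acme Corp",
    max_sentences=2,
    conversation="User: My order is late.\nAgent: I can help with that.",
)
print(prompt.splitlines()[0])
```

Keeping instructions, output constraints, and injected context in distinct parts of the template makes a prompt easier to tune per model.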
Important Considerations
| Topic | Details |
|---|---|
| Language support | LLMs generate responses in any app language that the LLM itself supports. See Managing Languages. |
| Data sharing | GenAI features send data to third-party LLM providers (OpenAI, Azure OpenAI, Anthropic, and others). Review your provider’s data handling policies before enabling features in production. |
| Free token security | Do not upload sensitive documents while on free tokens. Switch to your own LLM account before handling confidential or regulated data. |