NLP Engines
| Engine | Description |
|---|---|
| Fundamental Meaning (FM) | Computational linguistics built on ChatScript. Analyzes word meaning, position, conjugation, capitalization, and sentence structure. |
| Machine Learning (ML) | Trains on example utterances; learns and generalizes to recognize similar inputs. |
| Knowledge Graph (KG) | Converts FAQ content into structured conversational responses. |
| Traits Engine | Multi-class classifier identifying characteristics in utterances to refine intent detection. |
| Small Talk Engine | Handles conversational pleasantries to make interactions feel natural. |
| Ranking and Resolver (R&R) | Scores and ranks results from all engines to determine the winning intent. |
Conversation Flow
- NLP Analysis — User input passes through all NLP engines for intent detection and entity extraction.
- Task Execution — The winning intent executes. The conversation engine maintains state (user details, previous intents, context).
- Preconditions — If required conditions aren’t met, the intent is rejected.
- Negative Patterns — Prevent an intent from matching utterances it should not handle.
- Event Handling — Triggers standard behavior for events such as welcome messages and sentiment detection.
- Interruption Handling — Manages mid-task intent switches and sentiment-triggered agent transfers.
- Response Generation — A response is generated and rendered for the user’s channel.
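The steps above can be sketched as a single turn-handling pipeline. Every function name here is an illustrative stand-in, not a platform API, and the toy engine only recognizes a greeting:

```python
# Minimal sketch of one conversation turn; not a platform API.

def handle_turn(utterance: str, context: dict) -> str:
    candidates = analyze(utterance)             # 1. NLP analysis across all engines
    intent = rank_and_resolve(candidates)       # 2. winning intent selected
    if not preconditions_met(intent, context):  # 3. reject if preconditions fail
        return fallback_response()
    context["previous_intent"] = intent         # state maintained across turns
    return generate_response(intent, context)   # 4. response generation

# Toy stand-ins so the sketch runs end to end:
def analyze(u):
    return [("Greet", 0.9)] if "hello" in u.lower() else [("Unknown", 0.1)]

def rank_and_resolve(candidates):
    return max(candidates, key=lambda c: c[1])[0]

def preconditions_met(intent, ctx):
    return intent != "Unknown"

def fallback_response():
    return "Sorry, I didn't get that."

def generate_response(intent, ctx):
    return f"Executing {intent}"

print(handle_turn("Hello there", {}))  # Executing Greet
```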
NLP Preprocessing
Before intent detection, each utterance undergoes:

| Step | Description |
|---|---|
| Tokenization | Splits utterance into sentences, then words. Uses TreeBank Tokenizer for English. |
| toLower() | Converts to lowercase. Not applied to German, where capitalization changes word meaning. ML and KG only. |
| Stop word removal | Removes low-signal words. Language-specific; disabled by default. ML and KG only. |
| Stemming | Cuts words to their stem (e.g., “Working” → “work”). Output may not be a valid word. |
| Lemmatization | Converts to base form using a dictionary (e.g., “housing” → “house”). |
| N-grams | Combines co-occurring words (e.g., “New York City”) for richer context. |
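The pipeline above can be sketched in a few lines of plain Python. These are naive stand-ins for each step — a real system would use a proper tokenizer (e.g. TreeBank), a full stop-word list, and a dictionary-backed stemmer or lemmatizer:

```python
# Illustrative preprocessing pipeline; every step is a deliberately naive stand-in.

STOP_WORDS = {"the", "a", "an", "is", "to"}  # tiny example list

def preprocess(utterance: str) -> list[str]:
    tokens = utterance.split()                           # tokenization (naive split)
    tokens = [t.lower() for t in tokens]                 # toLower()
    tokens = [t for t in tokens if t not in STOP_WORDS]  # stop word removal
    tokens = [t[:-3] if t.endswith("ing") else t
              for t in tokens]                           # crude suffix stemming
    return tokens

def ngrams(tokens: list[str], n: int = 2) -> list[tuple[str, ...]]:
    """Combine co-occurring words into n-grams, e.g. ("new", "york")."""
    return [tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1)]

tokens = preprocess("The user is working")
print(tokens)          # ['user', 'work']
print(ngrams(tokens))  # [('user', 'work')]
```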
Choosing an NLP Engine
| Scenario | Recommended Engine |
|---|---|
| Large corpus per intent | ML — flexible, auto-learns from examples. A corpus of 200–300 utterances works for distinct intents; 1000+ for similar intents. |
| Query/FAQ-type intents or document-based answers | KG — semantic matching for knowledge content. |
| Idiomatic or command-like sentences, or low tolerance for false positives | FM — deterministic, rule-based. |
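The decision table above can be encoded as a simple helper. This is purely illustrative — the parameter names and the 20-example cutoff are assumptions, and in practice the engines run together rather than one being chosen exclusively:

```python
# Illustrative encoding of the engine-selection guidance; thresholds are assumed.

def recommend_engine(examples_per_intent: int, faq_style: bool, command_like: bool) -> str:
    if faq_style:
        return "KG"  # query/FAQ-type intents suit the Knowledge Graph
    if command_like or examples_per_intent < 20:
        return "FM"  # rule-based matching needs no training corpus
    return "ML"      # enough examples to train and generalize

print(recommend_engine(300, faq_style=False, command_like=False))  # ML
```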
NLP Configuration in the Platform
Go to Automation > Natural Language:

| Section | Purpose |
|---|---|
| Training | Add ML utterances, synonyms, concepts, and patterns. |
| NLU Config | Set confidence thresholds, engine tuning, and advanced settings. |