Step 1: Enable Agentic RAG
Conversational context is processed by the Query Rephrasing Agent, which is part of the Agentic RAG pipeline. You must enable this before passing the conversation history.
- Go to Search AI > Agentic RAG.
- Enable Agentic RAG. Learn more.
- Configure the LLM model and prompt to be used for query rephrasing. Learn more.
Each enabled Agentic RAG stage adds latency to the search request. Disable any stages you do not need.
Step 2: Pass the Conversation History in the API Request
Use the customData.previousConversation field in the Advanced Search API request body to pass the prior conversation as context.
The previousConversation array contains objects with the previous queries and their answers. The entire array is stringified and passed directly to the LLM, so the model has full context of the exchange when rephrasing the current query.
Request example
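A minimal sketch of the request body in Python. The query text, Q&A content, and the surrounding request construction are illustrative; only the customData.previousConversation structure comes from this document.

```python
import json

# Hedged sketch: example queries and answers are invented for illustration.
# The customData.previousConversation array carries the prior Q&A turns that
# the Query Rephrasing Agent uses as context for the current query.
body = {
    "query": "How do I renew it online?",
    "customData": {
        "previousConversation": [
            {
                "query": "What documents do I need to renew my passport?",
                "answer": "You need your current passport, a recent photo, "
                          "and the completed renewal form."
            }
        ]
    }
}

# Serialize the request body before sending it to the Advanced Search API.
payload = json.dumps(body)
```

The current query goes in the top-level query field as usual; only the history travels inside customData.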
Passing Additional Context in the Same Array
You can include additional contextual objects alongside the Q&A entries in the previousConversation array. For example, if your application generates a conversation summary, you can pass it as a separate object in the same array.
The previousConversation array must serialize to valid JSON. It can include any number of objects.
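A short sketch of mixing a summary object with the Q&A entries. The "summary" key and all text values are illustrative assumptions; the document only requires that the array remain valid JSON.

```python
import json

# Hedged sketch: "summary" is an illustrative key name for an extra
# contextual object placed alongside the Q&A entries in the same array.
previous_conversation = [
    {
        "query": "What plans do you offer?",
        "answer": "We offer Basic, Pro, and Enterprise plans."
    },
    # An application-generated summary passed as its own object:
    {"summary": "User is comparing subscription plans and pricing tiers."}
]

# The whole array must still serialize to valid JSON.
serialized = json.dumps(previous_conversation)
```

Any object shape works as long as the array as a whole remains valid JSON, since the entire array is stringified and handed to the LLM.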
Points to Note
- The previousConversation field is processed by the Query Rephrasing Agent. Agentic RAG must be enabled for the context to take effect. If Agentic RAG is disabled, the field is ignored.
- There is no enforced limit on the number of entries in the array, but keep in mind that very long conversation histories increase the token count sent to the LLM, which can affect latency and cost.
- You can also use customData to pass other context, such as user identity or location. These can coexist with previousConversation in the same customData object. Learn more.
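A sketch of a combined customData object. The "userId" and "location" key names are assumptions chosen for illustration; the document only states that other context such as identity or location can sit alongside previousConversation.

```python
# Hedged sketch: "userId" and "location" are illustrative key names,
# not documented field names.
custom_data = {
    "previousConversation": [
        {
            "query": "Where is my nearest branch?",
            "answer": "Your nearest branch is at 12 Main Street."
        }
    ],
    "userId": "user-123",          # illustrative identity field (assumption)
    "location": "Berlin, Germany"  # illustrative location field (assumption)
}

# previousConversation coexists with the other context keys in one customData object.
body = {"query": "What are its opening hours?", "customData": custom_data}
```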