Note
This feature is currently in public preview. This preview is provided without a service-level agreement and isn't recommended for production workloads. Certain features might not be supported or might have constrained capabilities. For more information, see Supplemental Terms of Use for Azure Previews.
A search index knowledge source specifies a connection to an Azure AI Search index that provides searchable content in an agentic retrieval pipeline. Knowledge sources are created independently, referenced in a knowledge base, and used as grounding data when an agent or chatbot calls a retrieve action at query time.
Prerequisites
Azure AI Search in any region that provides agentic retrieval. You must have semantic ranker enabled.
A search index containing plain text or vector content with a semantic configuration. Review the index criteria for agentic retrieval. The index must be on the same search service as the knowledge base.
The latest preview version of the Azure.Search.Documents client library for the .NET SDK.
Permission to create and use objects on Azure AI Search. We recommend role-based access, but you can use API keys if a role assignment isn't feasible. For more information, see Connect to a search service.
Note
Although you can use the Azure portal to create search index knowledge sources, the portal targets the 2025-08-01-preview API version, which uses the previous "knowledge agent" terminology and doesn't support all 2025-11-01-preview features. For help with breaking changes, see Migrate your agentic retrieval code.
Check for existing knowledge sources
A knowledge source is a top-level, reusable object. Knowing about existing knowledge sources is helpful for either reuse or naming new objects.
Run the following code to list knowledge sources by name and type.
// List knowledge sources by name and type
using Azure.Search.Documents.Indexes;
var indexClient = new SearchIndexClient(new Uri(searchEndpoint), credential);
var knowledgeSources = indexClient.GetKnowledgeSourcesAsync();
Console.WriteLine("Knowledge Sources:");
await foreach (var ks in knowledgeSources)
{
Console.WriteLine($" Name: {ks.Name}, Type: {ks.GetType().Name}");
}
You can also return a single knowledge source by name to review its JSON definition.
using Azure.Search.Documents.Indexes;
using System.Text.Json;
var indexClient = new SearchIndexClient(new Uri(searchEndpoint), credential);
// Specify the knowledge source name to retrieve
string ksNameToGet = "earth-knowledge-source";
// Get its definition
var knowledgeSourceResponse = await indexClient.GetKnowledgeSourceAsync(ksNameToGet);
var ks = knowledgeSourceResponse.Value;
// Serialize to JSON for display
var jsonOptions = new JsonSerializerOptions
{
WriteIndented = true,
DefaultIgnoreCondition = System.Text.Json.Serialization.JsonIgnoreCondition.Never
};
Console.WriteLine(JsonSerializer.Serialize(ks, ks.GetType(), jsonOptions));
The following JSON is an example response for a search index knowledge source. Notice that the knowledge source specifies a single index name and which fields in the index to include in the query.
{
"SearchIndexParameters": {
"SearchIndexName": "earth-at-night",
"SourceDataFields": [
{
"Name": "id"
},
{
"Name": "page_chunk"
},
{
"Name": "page_number"
}
],
"SearchFields": [],
"SemanticConfigurationName": "semantic-config"
},
"Name": "earth-knowledge-source",
"Description": null,
"EncryptionKey": null,
"ETag": "<redacted>"
}
Create a knowledge source
Run the following code to create a search index knowledge source.
using Azure.Search.Documents.Indexes.Models;
// Create the knowledge source
var indexKnowledgeSource = new SearchIndexKnowledgeSource(
name: knowledgeSourceName,
searchIndexParameters: new SearchIndexKnowledgeSourceParameters(searchIndexName: indexName)
{
SourceDataFields = { new SearchIndexFieldReference(name: "id"), new SearchIndexFieldReference(name: "page_chunk"), new SearchIndexFieldReference(name: "page_number") }
}
);
await indexClient.CreateOrUpdateKnowledgeSourceAsync(indexKnowledgeSource);
Console.WriteLine($"Knowledge source '{knowledgeSourceName}' created or updated successfully.");
Source-specific properties
You can pass the following properties to create a search index knowledge source.
| Name | Description | Type | Editable | Required |
|---|---|---|---|---|
| Name | The name of the knowledge source, which must be unique within the knowledge sources collection and follow the naming guidelines for objects in Azure AI Search. | String | No | Yes |
| Description | A description of the knowledge source. | String | Yes | No |
| EncryptionKey | A customer-managed key to encrypt sensitive information in both the knowledge source and the generated objects. | Object | Yes | No |
| SearchIndexParameters | Parameters specific to search index knowledge sources: SearchIndexName, SemanticConfigurationName, SourceDataFields, and SearchFields. | Object | Yes | Yes |
| SearchIndexName | The name of the existing search index. | String | Yes | Yes |
| SemanticConfigurationName | Overrides the default semantic configuration for the search index. | String | Yes | No |
| SourceDataFields | The index fields returned when you specify include_reference_source_data in the knowledge base definition. These fields are used for citations and should be retrievable. Examples include the document name, file name, page numbers, or chapter numbers. | Array | Yes | No |
| SearchFields | The index fields to specifically search against. When unspecified, all fields are searched. | Array | Yes | No |
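To make the field roles concrete, the following sketch works over a knowledge source definition as plain Python dicts mirroring the serialized JSON shape shown earlier (not the Azure.Search.Documents model classes). The helper name `summarize_knowledge_source` is illustrative, not part of any SDK.

```python
def summarize_knowledge_source(ks: dict) -> dict:
    """Pull out the properties that drive query execution and citations."""
    params = ks["SearchIndexParameters"]
    return {
        "index": params["SearchIndexName"],
        # Fields returned for citations; these should be retrievable in the index.
        "citation_fields": [f["Name"] for f in params.get("SourceDataFields", [])],
        # An empty SearchFields list means all searchable fields are queried.
        "searched_fields": params.get("SearchFields") or "all searchable fields",
        "semantic_config": params.get("SemanticConfigurationName"),
    }

# Definition in the serialized shape shown in the example response above.
example = {
    "Name": "earth-knowledge-source",
    "SearchIndexParameters": {
        "SearchIndexName": "earth-at-night",
        "SourceDataFields": [{"Name": "id"}, {"Name": "page_chunk"}, {"Name": "page_number"}],
        "SearchFields": [],
        "SemanticConfigurationName": "semantic-config",
    },
}

summary = summarize_knowledge_source(example)
print(summary["citation_fields"])   # ['id', 'page_chunk', 'page_number']
print(summary["searched_fields"])   # all searchable fields
```

The key design point: SourceDataFields controls what comes back for citations, while SearchFields (when nonempty) narrows what gets searched.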
Assign to a knowledge base
If you're satisfied with the knowledge source, continue to the next step: specify the knowledge source in a knowledge base.
After the knowledge base is configured, use the retrieve action to query the knowledge source.
Delete a knowledge source
Before you can delete a knowledge source, you must delete any knowledge base that references it or update the knowledge base definition to remove the reference. For knowledge sources that generate an index and indexer pipeline, all generated objects are also deleted. However, if you used an existing index to create a knowledge source, your index isn't deleted.
If you try to delete a knowledge source that's in use, the action fails and returns a list of affected knowledge bases.
To delete a knowledge source:
Get a list of all knowledge bases on your search service.
# Get knowledge bases
import requests
import json
endpoint = "{search_url}/knowledgebases"
params = {"api-version": "2025-11-01-preview", "$select": "name"}
headers = {"api-key": "{api_key}"}
response = requests.get(endpoint, params = params, headers = headers)
print(json.dumps(response.json(), indent = 2))
An example response might look like the following:
{
  "@odata.context": "https://my-search-service.search.azure.cn/$metadata#knowledgebases(name)",
  "value": [
    { "name": "my-kb" },
    { "name": "my-kb-2" }
  ]
}
Get an individual knowledge base definition to check for knowledge source references.
# Get a knowledge base definition
import requests
import json
endpoint = "{search_url}/knowledgebases/{knowledge_base_name}"
params = {"api-version": "2025-11-01-preview"}
headers = {"api-key": "{api_key}"}
response = requests.get(endpoint, params = params, headers = headers)
print(json.dumps(response.json(), indent = 2))
An example response might look like the following:
{
  "name": "my-kb",
  "description": null,
  "retrievalInstructions": null,
  "answerInstructions": null,
  "outputMode": null,
  "knowledgeSources": [
    { "name": "my-blob-ks" }
  ],
  "models": [],
  "encryptionKey": null,
  "retrievalReasoningEffort": { "kind": "low" }
}
Either delete the knowledge base or update the knowledge base to remove the knowledge source if you have multiple sources. This example shows deletion.
# Delete a knowledge base
from azure.core.credentials import AzureKeyCredential
from azure.search.documents.indexes import SearchIndexClient
index_client = SearchIndexClient(endpoint = "search_url", credential = AzureKeyCredential("api_key"))
index_client.delete_knowledge_base("knowledge_base_name")
print("Knowledge base deleted successfully.")
Delete the knowledge source.
# Delete a knowledge source
from azure.core.credentials import AzureKeyCredential
from azure.search.documents.indexes import SearchIndexClient
index_client = SearchIndexClient(endpoint = "search_url", credential = AzureKeyCredential("api_key"))
index_client.delete_knowledge_source("knowledge_source_name")
print("Knowledge source deleted successfully.")
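The dependency check in the steps above (find which knowledge bases still reference a knowledge source before deleting it) can be sketched offline against the JSON definitions returned by the GET calls. The helper `find_referencing_kbs` is illustrative, not part of the azure-search-documents SDK.

```python
def find_referencing_kbs(knowledge_bases: list, ks_name: str) -> list:
    """Return names of knowledge bases whose definition references ks_name."""
    return [
        kb["name"]
        for kb in knowledge_bases
        if any(ks["name"] == ks_name for ks in kb.get("knowledgeSources", []))
    ]

# Definitions in the shape returned by GET /knowledgebases/{name}.
kbs = [
    {"name": "my-kb", "knowledgeSources": [{"name": "my-blob-ks"}]},
    {"name": "my-kb-2", "knowledgeSources": [{"name": "hotels-ks"}]},
]

# These knowledge bases must be deleted or updated before "my-blob-ks" can go.
blockers = find_referencing_kbs(kbs, "my-blob-ks")
print(blockers)  # ['my-kb']
```

Running this before the delete request avoids the failure path where the service rejects the deletion and returns the list of affected knowledge bases.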
In Azure AI Search, a knowledge base is a top-level object that orchestrates agentic retrieval. It defines which knowledge sources to query and the default behavior for retrieval operations. At query time, the retrieve method targets the knowledge base to run the configured retrieval pipeline.
A knowledge base specifies:
- One or more knowledge sources that point to searchable content.
- An optional LLM that provides reasoning capabilities for query planning and answer formulation.
- A retrieval reasoning effort that determines whether an LLM is invoked and manages cost, latency, and quality.
- Custom properties that control routing, source selection, output format, and object encryption.
After you create a knowledge base, you can update its properties at any time. If the knowledge base is in use, updates take effect on the next retrieval.
Important
2025-11-01-preview renames the 2025-08-01-preview knowledge agent to knowledge base. This is a breaking change. We recommend migrating existing code to the new APIs as soon as possible.
Prerequisites
Azure AI Search in any region that provides agentic retrieval. You must have semantic ranker enabled. If you're using a managed identity for role-based access to deployed models, your search service must be on the Basic pricing tier or higher.
Azure OpenAI with a supported LLM deployment.
One or more knowledge sources on your search service.
Permission to create and use objects on Azure AI Search. We recommend role-based access. Search Service Contributor can create and manage a knowledge base. Search Index Data Reader can run queries. Alternatively, you can use API keys if a role assignment isn't feasible. For more information, see Connect to a search service.
The latest preview version of the azure-search-documents client library for Python.
Note
Although you can use the Azure portal to create knowledge bases, the portal targets the 2025-08-01-preview API version, which uses the previous "knowledge agent" terminology and doesn't support all 2025-11-01-preview features. For help with breaking changes, see Migrate your agentic retrieval code.
Supported models
Use one of the following LLMs from Azure OpenAI or an equivalent open-source model. For deployment instructions, see Deploy Azure OpenAI models with Azure Foundry.
gpt-4o
gpt-4o-mini
gpt-4.1
gpt-4.1-nano
gpt-4.1-mini
gpt-5
gpt-5-nano
gpt-5-mini
Configure access
Azure AI Search needs access to the LLM from Azure OpenAI. We recommend Microsoft Entra ID for authentication and role-based access for authorization. You must be an Owner or User Access Administrator to assign roles. If roles aren't feasible, use key-based authentication instead.
On your model provider, such as Foundry Models, assign Cognitive Services User to the managed identity of your search service. If you're testing locally, assign the same role to your user account.
For local testing, follow the steps in Quickstart: Connect without keys to sign in to a specific subscription and tenant. Use DefaultAzureCredential instead of AzureKeyCredential in each request, which should look similar to the following example:
# Authenticate using roles
from azure.identity import DefaultAzureCredential
index_client = SearchIndexClient(endpoint = "search_url", credential = DefaultAzureCredential())
Important
Code snippets in this article use API keys. If you use role-based authentication, update each request accordingly. In a request that specifies both approaches, the API key takes precedence.
Check for existing knowledge bases
Knowing about existing knowledge bases is helpful for either reuse or naming new objects. Any 2025-08-01-preview knowledge agents are returned in the knowledge bases collection.
Run the following code to list existing knowledge bases by name.
# List knowledge bases by name
import requests
import json
endpoint = "{search_url}/knowledgebases"
params = {"api-version": "2025-11-01-preview", "$select": "name"}
headers = {"api-key": "{api_key}"}
response = requests.get(endpoint, params = params, headers = headers)
print(json.dumps(response.json(), indent = 2))
You can also return a single knowledge base by name to review its JSON definition.
# Get a knowledge base definition
import requests
import json
endpoint = "{search_url}/knowledgebases/{knowledge_base_name}"
params = {"api-version": "2025-11-01-preview"}
headers = {"api-key": "{api_key}"}
response = requests.get(endpoint, params = params, headers = headers)
print(json.dumps(response.json(), indent = 2))
The following JSON is an example response for a knowledge base.
{
"name": "my-kb",
"description": "A sample knowledge base.",
"retrievalInstructions": null,
"answerInstructions": null,
"outputMode": null,
"knowledgeSources": [
{
"name": "my-blob-ks"
}
],
"models": [],
"encryptionKey": null,
"retrievalReasoningEffort": {
"kind": "low"
}
}
Create a knowledge base
A knowledge base drives the agentic retrieval pipeline. In application code, it's called by other agents or chatbots.
A knowledge base connects knowledge sources (searchable content) to an LLM deployment from Azure OpenAI. Properties on the LLM establish the connection, while properties on the knowledge source establish defaults that inform query execution and the response.
Run the following code to create a knowledge base.
# Create a knowledge base
from azure.core.credentials import AzureKeyCredential
from azure.search.documents.indexes import SearchIndexClient
from azure.search.documents.indexes.models import KnowledgeBase, KnowledgeBaseAzureOpenAIModel, KnowledgeSourceReference, AzureOpenAIVectorizerParameters, KnowledgeRetrievalOutputMode, KnowledgeRetrievalLowReasoningEffort
index_client = SearchIndexClient(endpoint = "search_url", credential = AzureKeyCredential("api_key"))
aoai_params = AzureOpenAIVectorizerParameters(
resource_url = "aoai_endpoint",
deployment_name = "aoai_gpt_deployment",
model_name = "aoai_gpt_model",
)
knowledge_base = KnowledgeBase(
name = "my-kb",
description = "This knowledge base handles questions directed at two unrelated sample indexes.",
retrieval_instructions = "Use the hotels knowledge source for queries about where to stay, otherwise use the earth at night knowledge source.",
answer_instructions = "Provide a two sentence concise and informative answer based on the retrieved documents.",
output_mode = KnowledgeRetrievalOutputMode.ANSWER_SYNTHESIS,
knowledge_sources = [
KnowledgeSourceReference(name = "hotels-ks"),
KnowledgeSourceReference(name = "earth-at-night-ks"),
],
models = [KnowledgeBaseAzureOpenAIModel(azure_open_ai_parameters = aoai_params)],
encryption_key = None,
retrieval_reasoning_effort = KnowledgeRetrievalLowReasoningEffort(),
)
index_client.create_or_update_knowledge_base(knowledge_base)
print(f"Knowledge base '{knowledge_base.name}' created or updated successfully.")
Knowledge base properties
You can pass the following properties to create a knowledge base.
| Name | Description | Type | Required |
|---|---|---|---|
| name | The name of the knowledge base, which must be unique within the knowledge bases collection and follow the naming guidelines for objects in Azure AI Search. | String | Yes |
| description | A description of the knowledge base. The LLM uses the description to inform query planning. | String | No |
| retrieval_instructions | A prompt for the LLM to determine whether a knowledge source should be in scope for a query, which is recommended when you have multiple knowledge sources. This field influences both knowledge source selection and query formulation. For example, instructions could append information or prioritize a knowledge source. Instructions are passed directly to the LLM, which means it's possible to provide instructions that break query planning, such as instructions that result in bypassing an essential knowledge source. | String | No |
| answer_instructions | Custom instructions to shape synthesized answers. The default is null. For more information, see Use answer synthesis for citation-backed responses. | String | No |
| output_mode | Valid values are answer_synthesis for an LLM-formulated answer or extracted_data for full search results that you can pass to an LLM as a downstream step. | String | No |
| knowledge_sources | One or more supported knowledge sources. | Array | Yes |
| models | A connection to a supported LLM used for answer formulation or query planning. In this preview, models can contain just one model, and the model provider must be Azure OpenAI. Obtain model information from the Foundry portal or a command-line request. You can use role-based access control instead of API keys for the Azure AI Search connection to the model. For more information, see How to deploy Azure OpenAI models with Foundry. | Object | No |
| encryption_key | A customer-managed key to encrypt sensitive information in both the knowledge base and the generated objects. | Object | No |
| retrieval_reasoning_effort | Determines the level of LLM-related query processing. Valid values are minimal, low (default), and medium. For more information, see Set the retrieval reasoning effort. | Object | No |
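Per the table, only the name and at least one knowledge source are strictly required; the other properties fall back to service defaults. The following sketch shows a minimal definition as plain dicts in the JSON wire shape (not the SDK model classes), with an illustrative `validate_kb` helper that is not part of any Azure API.

```python
# Required knowledge base properties, per the table above.
REQUIRED = {"name", "knowledgeSources"}

def validate_kb(kb: dict) -> list:
    """Return required properties that are missing or empty, sorted by name."""
    return sorted(p for p in REQUIRED if not kb.get(p))

# Minimal definition: everything else (models, output mode, instructions,
# reasoning effort) is optional and falls back to service defaults.
minimal_kb = {
    "name": "my-kb",
    "knowledgeSources": [{"name": "earth-at-night-ks"}],
}

print(validate_kb(minimal_kb))         # [] -> ready to submit
print(validate_kb({"name": "my-kb"}))  # ['knowledgeSources'] -> incomplete
```

A local check like this is cheap insurance before a create-or-update call, since a rejected request only surfaces the first validation error.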
Query a knowledge base
Call the retrieve action on the knowledge base to verify the LLM connection and return results. For more information about the retrieve request and response schema, see Retrieve data using a knowledge base in Azure AI Search.
Replace "Where does the ocean look green?" with a query string that's valid for your knowledge sources.
# Send grounding request
from azure.core.credentials import AzureKeyCredential
from azure.search.documents.knowledgebases import KnowledgeBaseRetrievalClient
from azure.search.documents.knowledgebases.models import KnowledgeBaseMessage, KnowledgeBaseMessageTextContent, KnowledgeBaseRetrievalRequest, SearchIndexKnowledgeSourceParams
kb_client = KnowledgeBaseRetrievalClient(endpoint = "search_url", knowledge_base_name = "knowledge_base_name", credential = AzureKeyCredential("api_key"))
request = KnowledgeBaseRetrievalRequest(
messages=[
KnowledgeBaseMessage(
role = "assistant",
content = [KnowledgeBaseMessageTextContent(text = "Use the earth at night index to answer the question. If you can't find relevant content, say you don't know.")]
),
KnowledgeBaseMessage(
role = "user",
content = [KnowledgeBaseMessageTextContent(text = "Where does the ocean look green?")]
),
],
knowledge_source_params=[
SearchIndexKnowledgeSourceParams(
knowledge_source_name = "earth-at-night-ks",
include_references = True,
include_reference_source_data = True,
always_query_source = False,
)
],
include_activity = True,
)
result = kb_client.retrieve(request)
print(result.response[0].content[0].text)
Key points:
- messages is required, but you can run this example using just the user role that provides the query.
- knowledge_source_params specifies one or more query targets. For each knowledge source, you can specify how much information to include in the output.
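To see what the request boils down to, here's a sketch that assembles the same payload as plain dicts in the REST wire shape (the SDK models serialize to an equivalent structure). The helper `build_retrieve_request` is illustrative, not an SDK function.

```python
def build_retrieve_request(query: str, ks_name: str, instructions: str = None) -> dict:
    """Assemble a retrieve request body; only the user message is mandatory."""
    messages = []
    if instructions:
        # Optional assistant message that steers retrieval.
        messages.append({"role": "assistant",
                         "content": [{"type": "text", "text": instructions}]})
    messages.append({"role": "user",
                     "content": [{"type": "text", "text": query}]})
    return {
        "messages": messages,
        "includeActivity": True,
        "knowledgeSourceParams": [{
            "knowledgeSourceName": ks_name,
            "kind": "searchIndex",
            "includeReferences": True,
            "includeReferenceSourceData": True,
            "alwaysQuerySource": False,
        }],
    }

# Minimal call: just the user query, no assistant instructions.
request = build_retrieve_request("Where does the ocean look green?", "earth-at-night-ks")
print([m["role"] for m in request["messages"]])  # ['user']
```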
The response to the sample query might look like the following example:
"response": [
{
"content": [
{
"type": "text",
"text": "The ocean appears green off the coast of Antarctica due to phytoplankton flourishing in the water, particularly in Granite Harbor near Antarctica's Ross Sea, where they can grow in large quantities during spring, summer, and even autumn under the right conditions [ref_id:0]. Additionally, off the coast of Namibia, the ocean can also look green due to blooms of phytoplankton and yellow-green patches of sulfur precipitating from bacteria in oxygen-depleted waters [ref_id:1]. In the Strait of Georgia, Canada, the waters turned bright green due to a massive bloom of coccolithophores, a type of phytoplankton [ref_id:5]. Furthermore, a milky green and blue bloom was observed off the coast of Patagonia, Argentina, where nutrient-rich waters from different currents converge [ref_id:6]. Lastly, a large bloom of cyanobacteria was captured in the Baltic Sea, which can also give the water a green appearance [ref_id:9]."
}
]
}
]
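The bracketed [ref_id:n] markers in the synthesized text tie each statement back to an entry in the returned references. A small sketch for pulling them out; the regex assumes the exact [ref_id:n] format shown in the sample response.

```python
import re

def cited_ref_ids(answer_text: str) -> list:
    """Extract distinct citation ids like [ref_id:0] from a synthesized answer."""
    return sorted({int(m) for m in re.findall(r"\[ref_id:(\d+)\]", answer_text)})

answer = (
    "Phytoplankton bloom near Antarctica [ref_id:0]. Sulfur patches off "
    "Namibia [ref_id:1]. Coccolithophores in the Strait of Georgia [ref_id:5]."
)
print(cited_ref_ids(answer))  # [0, 1, 5]
```

This is useful for rendering citations in a chat UI: each id indexes into the references array returned when include_references is true.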
Delete a knowledge base
If you no longer need the knowledge base or need to rebuild it on your search service, use this request to delete the object.
# Delete a knowledge base
from azure.core.credentials import AzureKeyCredential
from azure.search.documents.indexes import SearchIndexClient
index_client = SearchIndexClient(endpoint = "search_url", credential = AzureKeyCredential("api_key"))
index_client.delete_knowledge_base("knowledge_base_name")
print(f"Knowledge base deleted successfully.")
In Azure AI Search, a knowledge base is a top-level object that orchestrates agentic retrieval. It defines which knowledge sources to query and the default behavior for retrieval operations. At query time, the retrieve method targets the knowledge base to run the configured retrieval pipeline.
A knowledge base specifies:
- One or more knowledge sources that point to searchable content.
- An optional LLM that provides reasoning capabilities for query planning and answer formulation.
- A retrieval reasoning effort that determines whether an LLM is invoked and manages cost, latency, and quality.
- Custom properties that control routing, source selection, output format, and object encryption.
After you create a knowledge base, you can update its properties at any time. If the knowledge base is in use, updates take effect on the next retrieval.
Important
2025-11-01-preview renames the 2025-08-01-preview knowledge agent to knowledge base. This is a breaking change. We recommend migrating existing code to the new APIs as soon as possible.
Prerequisites
Azure AI Search in any region that provides agentic retrieval. You must have semantic ranker enabled. If you're using a managed identity for role-based access to deployed models, your search service must be on the Basic pricing tier or higher.
Azure OpenAI with a supported LLM deployment.
One or more knowledge sources on your search service.
Permission to create and use objects on Azure AI Search. We recommend role-based access. Search Service Contributor can create and manage a knowledge base. Search Index Data Reader can run queries. Alternatively, you can use API keys if a role assignment isn't feasible. For more information, see Connect to a search service.
The latest preview version of the Azure.Search.Documents client library for the .NET SDK.
Note
Although you can use the Azure portal to create knowledge bases, the portal targets the 2025-08-01-preview API version, which uses the previous "knowledge agent" terminology and doesn't support all 2025-11-01-preview features. For help with breaking changes, see Migrate your agentic retrieval code.
Supported models
Use one of the following LLMs from Azure OpenAI or an equivalent open-source model. For deployment instructions, see Deploy Azure OpenAI models with Azure Foundry.
gpt-4o
gpt-4o-mini
gpt-4.1
gpt-4.1-nano
gpt-4.1-mini
gpt-5
gpt-5-nano
gpt-5-mini
Configure access
Azure AI Search needs access to the LLM from Azure OpenAI. We recommend Microsoft Entra ID for authentication and role-based access for authorization. You must be an Owner or User Access Administrator to assign roles. If roles aren't feasible, use key-based authentication instead.
On your model provider, such as Foundry Models, assign Cognitive Services User to the managed identity of your search service. If you're testing locally, assign the same role to your user account.
For local testing, follow the steps in Quickstart: Connect without keys to get a personal access token for a specific subscription and tenant. Specify your access token in each request, which should look similar to the following example:
# List indexes using roles
GET https://{{search-url}}/indexes?api-version=2025-11-01-preview
Content-Type: application/json
Authorization: Bearer {{access-token}}
Important
Code snippets in this article use API keys. If you use role-based authentication, update each request accordingly. In a request that specifies both approaches, the API key takes precedence.
Check for existing knowledge bases
A knowledge base is a top-level, reusable object. Knowing about existing knowledge bases is helpful for either reuse or naming new objects. Any 2025-08-01-preview knowledge agents are returned in the knowledge bases collection.
Use Knowledge Bases - List (REST API) to list knowledge bases by name and type.
# List knowledge bases
GET {{search-url}}/knowledgebases?api-version=2025-11-01-preview&$select=name
Content-Type: application/json
api-key: {{search-api-key}}
You can also return a single knowledge base by name to review its JSON definition.
# Get knowledge base
GET {{search-url}}/knowledgebases/{{knowledge-base-name}}?api-version=2025-11-01-preview
Content-Type: application/json
api-key: {{search-api-key}}
The following JSON is an example response for a knowledge base.
{
"name": "my-kb",
"description": "A sample knowledge base.",
"retrievalInstructions": null,
"answerInstructions": null,
"outputMode": null,
"knowledgeSources": [
{
"name": "my-blob-ks"
}
],
"models": [],
"encryptionKey": null,
"retrievalReasoningEffort": {
"kind": "low"
}
}
Create a knowledge base
A knowledge base drives the agentic retrieval pipeline. In application code, it's called by other agents or chatbots.
A knowledge base connects knowledge sources (searchable content) to an LLM deployment from Azure OpenAI. Properties on the LLM establish the connection, while properties on the knowledge source establish defaults that inform query execution and the response.
Use Knowledge Bases - Create or Update (REST API) to formulate the request.
# Create a knowledge base
PUT {{search-url}}/knowledgebases/{{knowledge-base-name}}?api-version=2025-11-01-preview
Content-Type: application/json
api-key: {{search-api-key}}
{
"name" : "my-kb",
"description": "This knowledge base handles questions directed at two unrelated sample indexes.",
"retrievalInstructions": "Use the hotels knowledge source for queries about where to stay, otherwise use the earth at night knowledge source.",
"answerInstructions": null,
"outputMode": "answerSynthesis",
"knowledgeSources": [
{
"name": "hotels-ks"
},
{
"name": "earth-at-night-ks"
}
],
"models" : [
{
"kind": "azureOpenAI",
"azureOpenAIParameters": {
"resourceUri": "{{model-provider-url}}",
"apiKey": "{{model-api-key}}",
"deploymentId": "gpt-4.1-mini",
"modelName": "gpt-4.1-mini"
}
}
],
"encryptionKey": null,
"retrievalReasoningEffort": {
"kind": "low"
}
}
Knowledge base properties
You can pass the following properties to create a knowledge base.
| Name | Description | Type | Required |
|---|---|---|---|
| name | The name of the knowledge base, which must be unique within the knowledge bases collection and follow the naming guidelines for objects in Azure AI Search. | String | Yes |
| description | A description of the knowledge base. The LLM uses the description to inform query planning. | String | No |
| retrievalInstructions | A prompt for the LLM to determine whether a knowledge source should be in scope for a query, which is recommended when you have multiple knowledge sources. This field influences both knowledge source selection and query formulation. For example, instructions could append information or prioritize a knowledge source. Instructions are passed directly to the LLM, which means it's possible to provide instructions that break query planning, such as instructions that result in bypassing an essential knowledge source. | String | No |
| answerInstructions | Custom instructions to shape synthesized answers. The default is null. For more information, see Use answer synthesis for citation-backed responses. | String | No |
| outputMode | Valid values are answerSynthesis for an LLM-formulated answer or extractedData for full search results that you can pass to an LLM as a downstream step. | String | No |
| knowledgeSources | One or more supported knowledge sources. | Array | Yes |
| models | A connection to a supported LLM used for answer formulation or query planning. In this preview, models can contain just one model, and the model provider must be Azure OpenAI. Obtain model information from the Foundry portal or a command-line request. You can use role-based access control instead of API keys for the Azure AI Search connection to the model. For more information, see How to deploy Azure OpenAI models with Foundry. | Object | No |
| encryptionKey | A customer-managed key to encrypt sensitive information in both the knowledge base and the generated objects. | Object | No |
| retrievalReasoningEffort | Determines the level of LLM-related query processing. Valid values for kind are minimal, low (default), and medium. For more information, see Set the retrieval reasoning effort. | Object | No |
Query a knowledge base
Call the retrieve action on the knowledge base to verify the LLM connection and return results. For more information about the retrieve request and response schema, see Retrieve data using a knowledge base in Azure AI Search.
Use Knowledge Retrieval - Retrieve (REST API) to formulate the request. Replace "Where does the ocean look green?" with a query string that's valid for your knowledge sources.
# Send grounding request
POST {{search-url}}/knowledgebases/{{knowledge-base-name}}/retrieve?api-version=2025-11-01-preview
Content-Type: application/json
api-key: {{search-api-key}}
{
"messages" : [
{ "role" : "assistant",
"content" : [
{ "type" : "text", "text" : "Use the earth at night index to answer the question. If you can't find relevant content, say you don't know." }
]
},
{
"role" : "user",
"content" : [
{
"text" : "Where does the ocean look green?",
"type" : "text"
}
]
}
],
"includeActivity": true,
"knowledgeSourceParams": [
{
"knowledgeSourceName": "earth-at-night-ks",
"kind": "searchIndex",
"includeReferences": true,
"includeReferenceSourceData": true,
"alwaysQuerySource": false
}
]
}
Key points:
- messages is required, but you can run this example using just the user role that provides the query.
- knowledgeSourceParams specifies one or more query targets. For each knowledge source, you can specify how much information to include in the output.
The response to the sample query might look like the following example:
"response": [
{
"content": [
{
"type": "text",
"text": "The ocean appears green off the coast of Antarctica due to phytoplankton flourishing in the water, particularly in Granite Harbor near Antarctica's Ross Sea, where they can grow in large quantities during spring, summer, and even autumn under the right conditions [ref_id:0]. Additionally, off the coast of Namibia, the ocean can also look green due to blooms of phytoplankton and yellow-green patches of sulfur precipitating from bacteria in oxygen-depleted waters [ref_id:1]. In the Strait of Georgia, Canada, the waters turned bright green due to a massive bloom of coccolithophores, a type of phytoplankton [ref_id:5]. Furthermore, a milky green and blue bloom was observed off the coast of Patagonia, Argentina, where nutrient-rich waters from different currents converge [ref_id:6]. Lastly, a large bloom of cyanobacteria was captured in the Baltic Sea, which can also give the water a green appearance [ref_id:9]."
}
]
}
]
Delete a knowledge base
If you no longer need the knowledge base or need to rebuild it on your search service, use this request to delete the object.
# Delete a knowledge base
DELETE {{search-url}}/knowledgebases/{{knowledge-base-name}}?api-version=2025-11-01-preview
api-key: {{search-api-key}}