Use an Assistant MCP server
Every Pinecone Assistant has a dedicated MCP server that gives AI agents direct access to the assistant’s knowledge through the standardized Model Context Protocol (MCP). This page shows you how to connect an assistant’s MCP server with Cursor, Claude Desktop, and LangChain.
There are two ways to connect to an assistant MCP server:
- Remote MCP server - Use a dedicated MCP endpoint to connect directly to an assistant.
- Local MCP server - Run a Docker container locally that connects to an assistant.
Both options support a context tool that allows agents to retrieve relevant context snippets from your assistant’s knowledge. This is similar to the context API but fine-tuned for MCP clients. Additional capabilities, such as file access, will be added in future releases.
Remote MCP server
Every Pinecone Assistant has a dedicated MCP endpoint that you can connect directly to your AI applications. This option doesn’t require running any infrastructure and is managed by Pinecone.
The MCP endpoint for an assistant is:
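The exact URL is shown in the Pinecone console. As a sketch, it follows this shape (the path below is illustrative; use the endpoint shown for your assistant):

```
https://<YOUR_PINECONE_ASSISTANT_HOST>/mcp/assistants/<YOUR_ASSISTANT_NAME>/sse
```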
Prerequisites
- A Pinecone API key. You can create a new key in the Pinecone console.
- Your assistant’s MCP endpoint. To find it, go to your assistant in the Pinecone console. You’ll see the assistant MCP endpoint in the sidebar.
Use with LangChain
You can use the LangChain MCP adapters to integrate assistants into LangChain and build powerful multi-agent workflows. For example, the following code integrates LangChain with two assistants, one called `ai-news` and the other called `industry-reports`:
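The following sketch (assuming the `langchain-mcp-adapters` and `langgraph` packages, an illustrative `/sse` endpoint path, and a model of your choice) wires both assistants into a single ReAct-style agent:

```python
import os

def assistant_server_config(host: str, assistant: str, api_key: str) -> dict:
    """Build one server entry for MultiServerMCPClient (SSE transport).

    The /sse path is an assumption; use the MCP endpoint shown in the
    Pinecone console for your assistant.
    """
    return {
        "url": f"https://{host}/mcp/assistants/{assistant}/sse",
        "transport": "sse",
        "headers": {"Authorization": f"Bearer {api_key}"},
    }

async def main() -> None:
    # Third-party imports are kept inside main() so the helper above
    # can be used without these packages installed.
    from langchain_mcp_adapters.client import MultiServerMCPClient
    from langgraph.prebuilt import create_react_agent

    api_key = os.environ["PINECONE_API_KEY"]
    host = os.environ["PINECONE_ASSISTANT_HOST"]

    # One MCP server per assistant; each exposes a context tool.
    client = MultiServerMCPClient({
        "ai-news": assistant_server_config(host, "ai-news", api_key),
        "industry-reports": assistant_server_config(host, "industry-reports", api_key),
    })
    tools = await client.get_tools()

    # Any chat model available in your environment works here.
    agent = create_react_agent("openai:gpt-4o", tools)
    result = await agent.ainvoke({
        "messages": [{
            "role": "user",
            "content": "Summarize this week's AI news, using industry reports for context.",
        }]
    })
    print(result["messages"][-1].content)
```

Run it with `asyncio.run(main())` after setting the `PINECONE_API_KEY` and `PINECONE_ASSISTANT_HOST` environment variables.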
Use with Claude Desktop
You can configure Claude Desktop to use your assistant’s remote MCP server. However, at this early stage of remote MCP server adoption, the Claude Desktop application does not support remote server URLs. In the example below, we work around this by using a local proxy server, supergateway, to forward requests to the remote MCP server with your API key.
supergateway is an open-source third-party tool. Use at your own risk.
1. Open Claude Desktop and go to Settings.
2. On the Developer tab, click Edit Config to open the configuration file.
3. Add the following configuration:
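A minimal sketch of the configuration, assuming supergateway's `--sse` and `--header` flags and an illustrative endpoint path (verify both against the supergateway README and your assistant's MCP endpoint):

```json
{
  "mcpServers": {
    "pinecone-assistant": {
      "command": "npx",
      "args": [
        "-y",
        "supergateway",
        "--sse",
        "https://<YOUR_PINECONE_ASSISTANT_HOST>/mcp/assistants/<YOUR_ASSISTANT_NAME>/sse",
        "--header",
        "Authorization: Bearer <YOUR_PINECONE_API_KEY>"
      ]
    }
  }
}
```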
   Replace `<YOUR_PINECONE_API_KEY>` with your Pinecone API key and `<YOUR_PINECONE_ASSISTANT_HOST>` with your Pinecone Assistant host.
4. Save the configuration file and restart Claude Desktop.
5. From the new chat screen, you should see a hammer (MCP) icon with the new MCP server available.
Local MCP server
Pinecone provides an open-source Pinecone Assistant MCP server that you can run locally with Docker. This option is useful for development, testing, or when you want to run the MCP server within your own infrastructure or expand the MCP server to include additional capabilities.
For the most up-to-date information on the local MCP server, see the Pinecone Assistant MCP server repository.
Prerequisites
- Docker is installed and running on your system.
- A Pinecone API key. You can create a new key in the Pinecone console.
- Your Pinecone Assistant host. To find it, go to your assistant in the Pinecone console. You’ll see the assistant Host in the sidebar.
Start the MCP server
Download the `assistant-mcp` Docker image:
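For example (the image name `pinecone/assistant-mcp` is an assumption; the repository README lists the published image):

```shell
docker pull pinecone/assistant-mcp
```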
Start the MCP server, providing your Pinecone API key and Pinecone Assistant host:
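For example, passing both values as environment variables (image name assumed as above):

```shell
docker run -i --rm \
  -e PINECONE_API_KEY=<YOUR_PINECONE_API_KEY> \
  -e PINECONE_ASSISTANT_HOST=<YOUR_PINECONE_ASSISTANT_HOST> \
  pinecone/assistant-mcp
```

The `-i` flag keeps stdin open so the MCP client can talk to the server over stdio, and `--rm` removes the container when the session ends.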
Use with Claude Desktop
1. Open Claude Desktop and go to Settings.
2. On the Developer tab, click Edit Config to open the configuration file.
3. Add the following configuration:
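A minimal sketch, assuming the image is published as `pinecone/assistant-mcp` (check the repository README for the exact image name):

```json
{
  "mcpServers": {
    "pinecone-assistant": {
      "command": "docker",
      "args": [
        "run", "-i", "--rm",
        "-e", "PINECONE_API_KEY",
        "-e", "PINECONE_ASSISTANT_HOST",
        "pinecone/assistant-mcp"
      ],
      "env": {
        "PINECONE_API_KEY": "<YOUR_PINECONE_API_KEY>",
        "PINECONE_ASSISTANT_HOST": "<YOUR_PINECONE_ASSISTANT_HOST>"
      }
    }
  }
}
```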
   Replace `<YOUR_PINECONE_API_KEY>` with your Pinecone API key and `<YOUR_PINECONE_ASSISTANT_HOST>` with your Pinecone Assistant host.
4. Save the configuration file and restart Claude Desktop.
5. From the new chat screen, you should see a hammer (MCP) icon with the new MCP server available.
Use with Cursor
1. Open Cursor and create a `.cursor` directory in your project root if it doesn't exist.
2. Create a `.cursor/mcp.json` file if it doesn't exist and open it.
3. Add the following configuration:
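A sketch of the same Docker-based configuration for Cursor, with the image name again assumed to be `pinecone/assistant-mcp`:

```json
{
  "mcpServers": {
    "pinecone-assistant": {
      "command": "docker",
      "args": [
        "run", "-i", "--rm",
        "-e", "PINECONE_API_KEY",
        "-e", "PINECONE_ASSISTANT_HOST",
        "pinecone/assistant-mcp"
      ],
      "env": {
        "PINECONE_API_KEY": "<YOUR_PINECONE_API_KEY>",
        "PINECONE_ASSISTANT_HOST": "<YOUR_PINECONE_ASSISTANT_HOST>"
      }
    }
  }
}
```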
   Replace `<YOUR_PINECONE_API_KEY>` with your Pinecone API key and `<YOUR_PINECONE_ASSISTANT_HOST>` with your Pinecone Assistant host.
4. Save the configuration file.
Next Steps
- Visit the Pinecone Assistant MCP Server repository for detailed installation and usage instructions.
- Learn about Model Context Protocol and how it enables AI agents to interact with tools and data.
- Explore retrieve context snippets to understand the underlying API functionality.