The Pinecone Assistant sample app demonstrates how to connect a chat interface to your Pinecone Assistant to answer complex questions about your proprietary data. The app lets users upload PDF documents, have them processed, and then ask questions about their content in a chat interface.
From the project root directory, run the following command:
```bash
cd pinecone-assistant && npm install
```
Make sure you have populated the .env file with the relevant keys (note that the server action shown below also reads PINECONE_ASSISTANT_URL and PINECONE_ASSISTANT_ID from the environment):
```bash
PINECONE_API_KEY="your-pinecone-api-key-here"
PINECONE_ASSISTANT_NAME="your-pinecone-assistant-name-here"
# Set this if you want users chatting with your assistant to be able to see
# and click into the files used as references in answers
SHOW_ASSISTANT_FILES=true
```
This project uses a standard Next.js application structure with API routes for backend functionality.

### Frontend client

The frontend uses Next.js, Tailwind CSS, and custom React components to power the chat interface.

### Backend server

This project uses Next.js API routes to proxy requests to the Pinecone Assistant API.
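As an illustration of that proxy pattern, a route handler can forward a request to the assistant and attach the API key server-side so it never reaches the browser. The following is a minimal sketch, not the sample app's actual route: the app/api/files/route.ts path and the /files endpoint are assumptions, while the headers mirror the server action shown later.

```typescript
// Hypothetical app/api/files/route.ts: proxies a file-listing request to the
// Pinecone Assistant API. The endpoint path is illustrative; the real sample
// app's routes may differ.
import { NextResponse } from 'next/server';

export async function GET() {
  const url = `${process.env.PINECONE_ASSISTANT_URL}/${process.env.PINECONE_ASSISTANT_NAME}/files`;

  // Forward the request, adding credentials on the server so they are never
  // exposed to the client.
  const response = await fetch(url, {
    headers: {
      Authorization: `Bearer ${process.env.PINECONE_API_KEY}`,
      'X-Project-Id': process.env.PINECONE_ASSISTANT_ID!,
    },
  });

  if (!response.ok) {
    return NextResponse.json(
      { error: 'Request to the Pinecone Assistant API failed' },
      { status: response.status }
    );
  }

  // Pass the assistant's JSON response straight through to the frontend
  return NextResponse.json(await response.json());
}
```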
- **Connect to an existing Pinecone Assistant**: Provide a chat experience over your assistant that can be hosted privately or publicly.
- **Streaming responses**: Ask the assistant questions and get responses streamed to the frontend in real time.
- **Reference highlighting**: Documents that were used to answer a question are highlighted as references.
### Server action for chat

The server action creates a stream of responses from the Pinecone Assistant API:
```typescript
'use server'

import { createStreamableValue } from 'ai/rsc'
import { EventSource } from 'extended-eventsource';

type Message = {
  role: string;
  content: string;
}

export async function chat(messages: Message[]) {
  // Create an initial stream, which we'll populate with events from the Pinecone Assistants API
  const stream = createStreamableValue()

  // Construct the full URL to the Pinecone Assistant API for the specific assistant
  // indicated by the user
  const url = `${process.env.PINECONE_ASSISTANT_URL}/${process.env.PINECONE_ASSISTANT_NAME}/chat/completions`

  const eventSource = new EventSource(url, {
    method: 'POST',
    body: JSON.stringify({
      stream: true,
      messages,
    }),
    headers: {
      Authorization: `Bearer ${process.env.PINECONE_API_KEY}`,
      'X-Project-Id': process.env.PINECONE_ASSISTANT_ID!,
    },
    disableRetry: true,
  });

  // When we receive a new message from the Pinecone Assistant API, we update the stream
  // unless the Assistant is done, in which case we close the stream
  eventSource.onmessage = (event: MessageEvent) => {
    const message = JSON.parse(event.data)
    if (message?.choices[0]?.finish_reason) {
      eventSource.close();
      stream.done();
    } else {
      stream.update(event.data)
    }
  };

  eventSource.onerror = (error) => {
    console.error('EventSource error:', error);
    eventSource.close();
  };

  return { object: stream.value }
}
```
### Chat functionality

The chat functionality in the Home component consumes the stream from the server action and updates the UI in real time:
```typescript
const handleChat = async () => {
  if (!input.trim()) return;

  const newUserMessage: Message = {
    id: uuidv4(), // Generate a unique ID
    role: 'user',
    content: input,
    timestamp: new Date().toISOString()
  };

  setMessages(prevMessages => [...prevMessages, newUserMessage]);
  setInput('');
  setIsStreaming(true);

  try {
    const { object } = await chat([newUserMessage]);

    let accumulatedContent = '';
    const newAssistantMessage: Message = {
      id: uuidv4(),
      role: 'assistant',
      content: '',
      timestamp: new Date().toISOString(),
      references: []
    };
    setMessages(prevMessages => [...prevMessages, newAssistantMessage]);

    // Process the response stream from the Assistant that is created in the ./actions.ts Server action
    for await (const chunk of readStreamableValue(object)) {
      try {
        const data = JSON.parse(chunk);
        const content = data.choices[0]?.delta?.content;
        if (content) {
          accumulatedContent += content;
        }
        setMessages(prevMessages => {
          const updatedMessages = [...prevMessages];
          const lastMessage = updatedMessages[updatedMessages.length - 1];
          lastMessage.content = accumulatedContent;
          return updatedMessages;
        });
      } catch (error) {
        console.error('Error parsing chunk:', error);
      }
    }

    // Extract references after the full message is received
    const extractedReferences = extractReferences(accumulatedContent);
    setReferencedFiles(extractedReferences);
  } catch (error) {
    console.error('Error in chat:', error);
    setError('An error occurred while chatting.');
  } finally {
    setIsStreaming(false);
  }
};
```
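The handler above calls extractReferences, which is defined elsewhere in the sample app. Its exact implementation depends on how the assistant formats citations; as a minimal sketch, assuming references appear as markdown links in the answer text, it might look like this:

```typescript
// Hypothetical sketch of a reference extractor. It assumes the assistant embeds
// citations as markdown links, e.g. "[source.pdf](https://example.com/source.pdf)";
// the sample app's actual extractReferences may parse a different format.
type Reference = {
  name: string;
  url: string;
};

function extractReferences(content: string): Reference[] {
  const references: Reference[] = [];
  const seen = new Set<string>();

  // Match markdown links of the form [label](url)
  const linkPattern = /\[([^\]]+)\]\((https?:\/\/[^\s)]+)\)/g;
  let match: RegExpExecArray | null;

  while ((match = linkPattern.exec(content)) !== null) {
    const [, name, url] = match;
    // De-duplicate files that are cited more than once in the answer
    if (!seen.has(url)) {
      seen.add(url);
      references.push({ name, url });
    }
  }

  return references;
}
```

The extracted references can then be rendered as clickable links beneath the assistant's answer when SHOW_ASSISTANT_FILES is enabled.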