This feature is in public preview.
What an evaluation does
When a deployment publishes, Marketplace:
- Generates a set of test questions from the connected sources.
- Asks the new version to answer each test question.
- Scores each response on faithfulness and relevance using an LLM judge.
- Records the aggregate result in the version history.
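The steps above can be sketched as a simple loop. This is a hypothetical illustration only: `generate_questions`, `answer_with_version`, and `llm_judge` are stand-in names for Marketplace internals that are not documented, and the stub implementations exist just to make the flow concrete.

```python
# Hypothetical sketch of the publish-time evaluation flow.
# All function names and stub bodies are assumptions, not the real internals.

def generate_questions(sources):
    """Stand-in: derive one test question per connected source topic."""
    return [f"What does the source say about {s}?" for s in sources]

def answer_with_version(version, question):
    """Stand-in: ask the new deployment version to answer a question."""
    return f"[{version}] answer to: {question}"

def llm_judge(question, answer):
    """Stand-in: score a response on faithfulness and relevance (0.0-1.0)."""
    return {"faithfulness": 1.0, "relevance": 1.0}

def evaluate(version, sources):
    """Run the whole flow and return the aggregate recorded per version."""
    questions = generate_questions(sources)
    scores = [llm_judge(q, answer_with_version(version, q)) for q in questions]
    n = len(scores)
    return {
        "version": version,
        "questions": n,
        "faithfulness": sum(s["faithfulness"] for s in scores) / n,
        "relevance": sum(s["relevance"] for s in scores) / n,
    }

result = evaluate("v2", ["pricing", "quotas"])
```

The aggregate dict is what the version history would record for this run.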
Metrics
| Metric | What it measures |
|---|---|
| Faithfulness | Whether the answer is grounded in the cited sources. |
| Relevance | Whether the answer addresses the question. |
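One common way to turn two judge scores into a per-question pass/fail result is to require both metrics to clear a threshold. The threshold value and the pass rule below are assumptions for illustration, not documented Marketplace behavior.

```python
# Illustrative only: pass/fail thresholding over the two judge metrics.
# The 0.7 cutoff is an assumed value, not a documented one.
PASS_THRESHOLD = 0.7

def passes(scores):
    """A response passes only if it is both grounded and on-topic."""
    return (scores["faithfulness"] >= PASS_THRESHOLD
            and scores["relevance"] >= PASS_THRESHOLD)

ok = passes({"faithfulness": 0.9, "relevance": 0.8})       # both clear 0.7
not_ok = passes({"faithfulness": 0.9, "relevance": 0.4})   # relevance too low
```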
Test question generation
Test questions are generated automatically from the connected content; they are not curated by hand. To get more representative tests, keep the connected sources focused on the domain the application serves.

Comparing versions
The deployment dashboard shows evaluation results per version. Use the comparison view to see:
- Aggregate score deltas between versions.
- Per-question pass and fail changes.
- Sources cited per response.
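A per-question comparison like the one in the dashboard can be sketched as a diff over pass/fail results. This is a hypothetical reconstruction, assuming each version's run maps every test question to a boolean pass result; the function and field names are made up for illustration.

```python
# Hypothetical per-question diff between two versions' evaluation runs.
# Inputs map question id -> bool (passed); both runs cover the same questions.

def compare(old, new):
    regressions = [q for q in old if old[q] and not new[q]]  # pass -> fail
    fixes = [q for q in old if not old[q] and new[q]]        # fail -> pass
    delta = (sum(new.values()) - sum(old.values())) / len(old)
    return {"regressions": regressions, "fixes": fixes, "pass_rate_delta": delta}

old_run = {"q1": True, "q2": False, "q3": True}
new_run = {"q1": True, "q2": True, "q3": False}
result = compare(old_run, new_run)
```

Here `q2` flipped fail-to-pass and `q3` pass-to-fail, so the aggregate pass rate is unchanged even though individual questions moved, which is exactly why the per-question view matters.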