mistral-embed | Mistral AI

METRIC: cosine, dot product
DIMENSION: 1024
MAX INPUT TOKENS: 8192
TASK: embedding

Overview

A high-performance embedding model from Mistral AI with a context window of 8k tokens, optimized for retrieval and RAG applications.
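Because the model returns 1024-dimensional vectors scored with cosine or dot-product similarity, comparing two embeddings outside of Pinecone is straightforward. A minimal sketch, assuming numpy and two vectors already produced by mistral-embed:

import numpy as np

def cosine_similarity(a: list[float], b: list[float]) -> float:
    # Cosine similarity between two 1024-dimensional embedding vectors
    a, b = np.asarray(a), np.asarray(b)
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))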

Using the model

Installation:

!pip install mistralai pinecone

Create Index

from pinecone import Pinecone, ServerlessSpec

pc = Pinecone(api_key="API_KEY")

# Create Index
index_name = "mistral-embed"

if not pc.has_index(index_name):
    pc.create_index(
        name=index_name,
        dimension=1024,
        metric="cosine",
        spec=ServerlessSpec(
            cloud='aws',
            region='us-east-1'
        )
    )

index = pc.Index(index_name)
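A serverless index can take a few moments to become ready after creation. An optional wait loop, using the status returned by describe_index:

import time

# Wait until the newly created index is ready to accept upserts
while not pc.describe_index(index_name).status['ready']:
    time.sleep(1)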

Embed & Upsert

# Embed data
data = [
    {"id": "vec1", "text": "Apple is a popular fruit known for its sweetness and crisp texture."},
    {"id": "vec2", "text": "The tech company Apple is known for its innovative products like the iPhone."},
    {"id": "vec3", "text": "Many people enjoy eating apples as a healthy snack."},
    {"id": "vec4", "text": "Apple Inc. has revolutionized the tech industry with its sleek designs and user-friendly interfaces."},
    {"id": "vec5", "text": "An apple a day keeps the doctor away, as the saying goes."},
]

from mistralai.client import MistralClient

client = MistralClient(api_key="MISTRAL_API_KEY")
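This walkthrough uses the legacy MistralClient interface from pre-1.0 releases of the mistralai package. If you have a 1.x release installed, the client class and embeddings call are different; a rough equivalent under the 1.x API is sketched below (the alt_client name is illustrative, and the rest of this example continues to use the legacy client):

# Assumes mistralai >= 1.0, where the client class is Mistral
from mistralai import Mistral

alt_client = Mistral(api_key="MISTRAL_API_KEY")
response = alt_client.embeddings.create(model="mistral-embed", inputs=["example text"])
vectors = [d.embedding for d in response.data]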


def embed(docs: list[str]) -> list[list[float]]:
    # Embed a batch of texts with mistral-embed and return the raw vectors
    response = client.embeddings(
        model="mistral-embed",
        input=docs,
    )
    return [d.embedding for d in response.data]

embeddings = embed([d["text"] for d in data])
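The embed call above sends every document in a single request. For larger datasets you would typically batch the texts so each request stays well under the 8,192-token input limit. A minimal sketch, assuming a fixed batch size is enough for your data:

def embed_in_batches(docs: list[str], batch_size: int = 32) -> list[list[float]]:
    # Embed documents in fixed-size batches, reusing the embed helper above
    vectors = []
    for i in range(0, len(docs), batch_size):
        vectors.extend(embed(docs[i:i + batch_size]))
    return vectors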

vectors = []
for d, e in zip(data, embeddings):
    vectors.append({
        "id": d['id'],
        "values": e,
        "metadata": {'text': d['text']}
    })

index.upsert(
    vectors=vectors,
    namespace="ns1"
)
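After the upsert, you can confirm the vectors landed in the ns1 namespace by checking the index statistics:

# Verify the upsert by inspecting per-namespace vector counts
print(index.describe_index_stats())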

Query

query = "Tell me about the tech company known as Apple"

x = embed([query])[0]

results = index.query(
    namespace="ns1",
    vector=x,
    top_k=3,
    include_values=False,
    include_metadata=True
)

print(results)
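Each match includes the vector id, its similarity score, and the metadata stored at upsert time, so the matched text can be printed directly:

# Print the matched texts with their similarity scores
for match in results["matches"]:
    print(f"{match['score']:.4f}: {match['metadata']['text']}")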
