Bring your own cloud (BYOC) lets you deploy Pinecone Database in your own AWS or GCP account to ensure data sovereignty and compliance, with Pinecone handling provisioning, operations, and maintenance.

BYOC is in public preview on AWS and GCP. To learn more about the offering, contact Pinecone.

Use cases

Pinecone BYOC is designed for organizations with high security and compliance requirements, for example:

  • Data sovereignty: If your organization has strict data governance policies, Pinecone BYOC can help ensure that all data is stored and processed locally and does not leave your security perimeter.
  • Data residency: The standard Pinecone managed service can be deployed in several AWS or GCP cloud regions. If your organization has specific data residency or latency constraints that require you to deploy in regions that Pinecone does not yet support, Pinecone BYOC gives you that flexibility.

Architecture

The BYOC architecture employs a split model:

  • Data plane: The data plane is responsible for storing and processing your records, executing queries, and interacting with object storage for index data. In a BYOC deployment, the data plane is hosted in your own AWS or GCP account within a dedicated VPC, ensuring that all data is stored and processed locally and does not leave your organizational boundaries. You use a private endpoint (AWS PrivateLink or GCP Private Service Connect) as an additional measure to secure requests to your indexes.
  • Control plane: The control plane is responsible for managing the index lifecycle as well as region-agnostic services such as user management, authentication, and billing. The control plane does not hold or process any records. In a BYOC deployment, the control plane is managed by Pinecone and hosted globally. Communication between the data plane and control plane is encrypted using TLS and employs role-based access control (RBAC) with minimal IAM permissions. The sketch after this list shows which host each type of call targets.
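
To make the split concrete, here is a minimal Python sketch (using the same SDK as the data example later on this page) of which host each kind of call targets. The API key and private endpoint host are placeholders; your private endpoint host is shown in the Pinecone console.

# pip install "pinecone[grpc]"
from pinecone.grpc import PineconeGRPC as Pinecone

pc = Pinecone(api_key="YOUR_API_KEY")

# Control-plane call: index lifecycle and metadata, handled by the
# Pinecone-managed, globally hosted control plane. No record data is involved.
print(pc.list_indexes())

# Data-plane call: reads and writes of records, served entirely from the
# dedicated VPC in your own cloud account via the private endpoint.
index = pc.Index(host="https://example-index-a1b234c.svc.private.aped-4627-b74a.pinecone.io")
print(index.describe_index_stats())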

Onboarding

The onboarding process for BYOC in AWS or GCP involves the following general stages:

1. Set up AWS or GCP account

   If you don’t already have an AWS or GCP account where you want to deploy Pinecone, you create one for this purpose.

2. Execute Terraform template

   You download and run a Terraform template provided by Pinecone. This template creates essential resources, including an IAM role with scoped-down permissions and a trust relationship with Pinecone’s AWS or GCP account.

3. Create environment

   Pinecone deploys a data plane cluster within a dedicated VPC in your AWS or GCP account, and you configure a private endpoint for securely connecting to your indexes via AWS PrivateLink or GCP Private Service Connect.

4. Validate

   Once the environment is operational, Pinecone performs validation tests to ensure proper functionality.

Configure a private endpoint

You use a private endpoint to securely connect to your BYOC indexes. On AWS, you use AWS PrivateLink; on GCP, you use GCP Private Service Connect.

Follow the instructions in the AWS documentation to create a VPC endpoint for connecting to your indexes via AWS PrivateLink. Once the endpoint is created, you can verify connectivity as sketched after these steps:

  • For Resource configurations, select the relevant resource for your Pinecone BYOC deployment.

  • For Network settings, select the VPC for your BYOC deployment.

  • In Additional settings, select Enable DNS name so that you can access your indexes using a DNS name.
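
Once the endpoint is created, a quick way to check it from inside the VPC is to resolve the private DNS name and issue a data-plane request over it. This is a minimal sketch that assumes you already have a BYOC index (see Create an index below); the host shown is a hypothetical example, and yours appears in the Pinecone console on the index details page.

# pip install "pinecone[grpc]"
import socket

from pinecone.grpc import PineconeGRPC as Pinecone

# Hypothetical private endpoint host for illustration.
host = "example-index-a1b234c.svc.private.aped-4627-b74a.pinecone.io"

# With Enable DNS name selected, the host should resolve to private IP
# addresses of the VPC endpoint when queried from inside the VPC.
for *_, sockaddr in socket.getaddrinfo(host, 443, proto=socket.IPPROTO_TCP):
    print(sockaddr[0])

# A data-plane request over the same host confirms end-to-end connectivity
# through AWS PrivateLink.
pc = Pinecone(api_key="YOUR_API_KEY")
index = pc.Index(host=f"https://{host}")
print(index.describe_index_stats())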

Create an index

Once your BYOC environment is ready, you can create a BYOC index in the Pinecone console or via the Pinecone API. Pinecone SDKs do not yet support BYOC index creation.

To create a BYOC index via the API, set the X-Pinecone-API-Version header to 2025-04 and the spec.byoc.environment parameter to the environment name provided to you during onboarding, for example:

curl -s "https://api.pinecone.io/indexes" \
  -H "Accept: application/json" \
  -H "Content-Type: application/json" \
  -H "Api-Key: $PINECONE_API_KEY" \
  -H "X-Pinecone-API-Version: 2025-04" \
  -d '{
        "name": "example-byoc-index",
        "vector_type": "dense",
        "dimension": 1536,
        "metric": "cosine",
        "spec": {
            "byoc": {
                "environment": "aws-us-east-1-b921"
            }
        },
        "tags"={
            "example": "tag"
        },
        "deletion_protection": "disabled"
      }'
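
Index creation is asynchronous. One way to wait for the new index to become ready is to poll describe_index, which is a control-plane call and therefore goes to api.pinecone.io rather than your private endpoint. Here is a minimal sketch using the Python SDK and the index name from the example above:

# pip install "pinecone[grpc]"
import time

from pinecone.grpc import PineconeGRPC as Pinecone

pc = Pinecone(api_key="YOUR_API_KEY")

# Poll the control plane until the BYOC index reports that it is ready.
while not pc.describe_index("example-byoc-index").status["ready"]:
    time.sleep(5)

print("example-byoc-index is ready")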

Read and write data

Once your private endpoint is configured, you can run data operations against a BYOC index as usual, but you must target the index using its private endpoint URL. The only difference in the URL is that .svc. is changed to .svc.private., as shown in the examples below.

BYOC does not support reading and writing data from the index browser in the Pinecone console.

# pip install "pinecone[grpc]"
from pinecone.grpc import PineconeGRPC as Pinecone

pc = Pinecone(api_key="YOUR_API_KEY")

# Use the Private Endpoint URL for host, which can be found in the 
# Pinecone console after you select the index to view more details.
index = pc.Index(host="https://example-index-a1b234c.svc.private.aped-4627-b74a.pinecone.io")

upsert_response = index.upsert(
    vectors=[
        {
          "id": "I",
          "values": [0.1, 0.1, 0.1, 0.1, 0.1, 0.1, 0.1, 0.1]
        },
        {
          "id": "J", 
          "values": [0.2, 0.2, 0.2, 0.2, 0.2, 0.2, 0.2, 0.2]
        },
        {
          "id": "K", 
          "values": [0.3, 0.3, 0.3, 0.3, 0.3, 0.3, 0.3, 0.3]
        },
        {
          "id": "L", 
          "values": [0.4, 0.4, 0.4, 0.4, 0.4, 0.4, 0.4, 0.4]
        }
    ]
)
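
Reads go through the same private endpoint host. As a minimal sketch, the following query returns the three records closest to an example vector, repeating the client setup from the example above so the snippet is self-contained:

# pip install "pinecone[grpc]"
from pinecone.grpc import PineconeGRPC as Pinecone

pc = Pinecone(api_key="YOUR_API_KEY")
index = pc.Index(host="https://example-index-a1b234c.svc.private.aped-4627-b74a.pinecone.io")

# Query over the private endpoint for the three nearest records.
query_response = index.query(
    vector=[0.1, 0.1, 0.1, 0.1, 0.1, 0.1, 0.1, 0.1],
    top_k=3,
    include_values=True
)
print(query_response)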

Monitoring

Pinecone engineers monitor the state of your BYOC deployment and manage incidents if they arise. In addition, you can monitor performance metrics for your BYOC indexes in the Pinecone console or with Prometheus or Datadog.

To use Prometheus, your monitoring tool must have access to your VPC.

FAQs