Shop The Look
Build a multimodal search engine for finding outfit inspiration with Pinecone, Google Cloud Vertex AI, and Vercel
$ npx create-pinecone-app@latest --template shop-the-look
The Shop The Look sample app demonstrates how to build a multimodal search engine for finding outfit inspiration using Pinecone Vector Database, Google Cloud Vertex AI’s Multimodal Embedding Model, and assets from Pexels. This application showcases how easy it is to use Pinecone to combine text, image, and video inputs and provide highly relevant outfit recommendations (or power other multimodal use cases).
Built with
- Pinecone Serverless
- Google Cloud Vertex AI Multimodal Embedding Model
- Next.js + Tailwind CSS
- Vercel
- Node.js
Run the sample app
The fastest way to get started is to use the create-pinecone-app CLI tool to run Shop The Look in demo mode.
Note: Demo mode is for developers who want to quickly deploy and test the Shop The Look application without setting up their own backend services or supplying their own image/video assets. The demo deployment includes over 45,000 royalty-free images and videos, letting you run the frontend locally while using our hosted backend API (which we have set up with all assets, a Pinecone Serverless index, and Google Cloud Vertex AI).
Full deployment
For developers who want to deploy a fully customizable Shop The Look application with their own images and videos, we offer a full deployment option. This method requires setting up both the frontend and backend components, including Pinecone Serverless, Google Cloud Vertex AI, and Google Cloud Storage, and uploading your own images and videos.
A short version of the full deployment is listed below.
For the complete documentation, see the Shop The Look Full Deployment instructions.
Get your Pinecone API key
You need an API key to make calls to your Pinecone project. To get one:
- Open the Pinecone console.
- Select your project.
- Go to API Keys.
- Copy your API key.
Get your Google Cloud credentials
We recommend you follow the full docs for setting up your Google Cloud credentials. Below is a summary of the instructions.
- Create an account in Google Cloud if you don’t already have one.
- Create a new project in the Google Cloud Console.
- Enable the Vertex AI API and Cloud Storage API.
- Create a service account with the Vertex AI User and Storage Object Viewer roles.
- Generate and download a JSON key for the service account.
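If you prefer the command line, the same setup can be scripted with the gcloud CLI. A rough sketch, where `my-stl-project` and `shop-the-look` are placeholder names:

```bash
# Enable the Vertex AI and Cloud Storage APIs
gcloud services enable aiplatform.googleapis.com storage.googleapis.com

# Create a service account (placeholder name)
gcloud iam service-accounts create shop-the-look

# Grant the Vertex AI User and Storage Object Viewer roles
gcloud projects add-iam-policy-binding my-stl-project \
  --member="serviceAccount:shop-the-look@my-stl-project.iam.gserviceaccount.com" \
  --role="roles/aiplatform.user"
gcloud projects add-iam-policy-binding my-stl-project \
  --member="serviceAccount:shop-the-look@my-stl-project.iam.gserviceaccount.com" \
  --role="roles/storage.objectViewer"

# Generate and download a JSON key for the service account
gcloud iam service-accounts keys create key.json \
  --iam-account=shop-the-look@my-stl-project.iam.gserviceaccount.com
```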
Create a Pinecone serverless index
Create a Pinecone index for this project with the following properties:
- dimension: `1408`
- metric: `cosine`
- region: choose your preferred region
You can create the index in the console, or by following the instructions here.
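For example, with the Pinecone Python client (the index name `shop-the-look` is a placeholder):

```python
from pinecone import Pinecone, ServerlessSpec

pc = Pinecone(api_key="YOUR_PINECONE_API_KEY")

# 1408 dimensions to match the Vertex AI multimodal embedding output
pc.create_index(
    name="shop-the-look",  # placeholder name
    dimension=1408,
    metric="cosine",
    spec=ServerlessSpec(cloud="aws", region="us-east-1"),  # choose your preferred region
)
```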
Start the project
Requires Node.js version 14+
Dependency installation
From the project root directory, run the following command:
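Assuming the standard npm workflow:

```bash
npm install
```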
Make sure you have populated `.env.development` with the relevant keys:
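The exact variable names live in the repo’s example env file; a plausible set for this stack looks like the following, where every name and value is a placeholder:

```bash
PINECONE_API_KEY=your-pinecone-api-key
PINECONE_INDEX_NAME=shop-the-look
GOOGLE_CLOUD_PROJECT=your-gcp-project-id
GOOGLE_APPLICATION_CREDENTIALS=/path/to/key.json
```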
Start the app:
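With the standard Next.js scripts:

```bash
npm run dev
```

The app should then be available at http://localhost:3000 (the Next.js default).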
Project structure
Shop The Look uses a Next.js frontend with a FastAPI Python backend. Utility scripts are included to process images and videos for embedding and upserting.
Frontend Client
The frontend uses Next.js, Tailwind CSS, and custom React components to power the search experience. It leverages custom API routes to make calls to the FastAPI backend for embedding and searching.
Backend Server
This project uses FastAPI to handle image/video processing, embedding generation, and Pinecone operations.
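A minimal sketch of what such an endpoint could look like; the route and payload shape are illustrative, not the app’s actual API surface:

```python
from typing import Optional

from fastapi import FastAPI, File, UploadFile

app = FastAPI()

@app.post("/search")  # illustrative route
async def search(query: Optional[str] = None, file: Optional[UploadFile] = File(None)):
    # 1. Embed the text query or the uploaded image/video with Vertex AI.
    # 2. Query the Pinecone index with the resulting 1408-dim vector.
    # 3. Return the matching assets to the Next.js frontend.
    ...
```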
Utility Scripts
The image and video embedding processing scripts process a folder of images and videos from Google Cloud Storage, generate embeddings using Vertex AI, and upsert them to the Pinecone index.
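The overall shape of those scripts, sketched with the Google Cloud Storage client (bucket name and prefix are placeholders):

```python
from google.cloud import storage

client = storage.Client()

# Walk every image/video object under the given prefix
for blob in client.list_blobs("your-assets-bucket", prefix="looks/"):
    gcs_uri = f"gs://{blob.bucket.name}/{blob.name}"
    # 1. Generate an embedding for gcs_uri with Vertex AI (see the Architecture section)
    # 2. Upsert the embedding, with the URI as metadata, to the Pinecone index
```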
Architecture
Shop The Look is built with Pinecone Vector Database, Google Cloud Vertex AI’s Multimodal Embedding Model, and assets from Pexels, and is hosted on Vercel.
Using pseudocode, we will explain how we built key components of Shop The Look.
Pinecone Serverless
This project uses Pinecone to store all the multimodal embeddings generated by Vertex AI’s Multimodal Embedding Model.
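Upserting a single embedding might look like this with the Pinecone Python client; the ID and metadata fields are illustrative:

```python
from pinecone import Pinecone

pc = Pinecone(api_key="YOUR_PINECONE_API_KEY")
index = pc.Index("shop-the-look")  # placeholder index name

index.upsert(
    vectors=[{
        "id": "pexels-12345",       # illustrative asset ID
        "values": image_embedding,  # 1408-dim vector from the Vertex AI sketch below
        "metadata": {"type": "image", "url": "gs://your-assets-bucket/looks/outfit.jpg"},
    }]
)
```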
Google Cloud Vertex AI integration
This app uses Google Cloud Vertex AI’s Multimodal Embedding Model to generate embeddings for text, images, and videos:
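With the Vertex AI Python SDK, generating a 1408-dimensional embedding for an image (optionally paired with contextual text) looks roughly like this; the video case passes `video=` instead. Project ID, location, and file paths are placeholders:

```python
import vertexai
from vertexai.vision_models import Image, MultiModalEmbeddingModel

vertexai.init(project="your-gcp-project-id", location="us-central1")
model = MultiModalEmbeddingModel.from_pretrained("multimodalembedding@001")

embeddings = model.get_embeddings(
    image=Image.load_from_file("gs://your-assets-bucket/looks/outfit.jpg"),
    contextual_text="summer linen outfit",  # optional
    dimension=1408,
)
image_embedding = embeddings.image_embedding  # list of 1408 floats
```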
Search functionality
When the user executes a search, their query (text or image/video) is sent to the backend, which uses the Multimodal Embedding Model to convert the query into vector embeddings:
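For a text query, that conversion is a single call. A sketch, reusing the model from the previous snippet:

```python
query_embedding = model.get_embeddings(
    contextual_text="red floral summer dress",  # the user's search text
    dimension=1408,
).text_embedding
```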
These embeddings are then used to perform a similarity search in Pinecone:
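A sketch, reusing the index and query embedding from above (`top_k` is an illustrative result count):

```python
results = index.query(
    vector=query_embedding,
    top_k=20,
    include_metadata=True,  # return the asset URL/type stored at upsert time
)
for match in results.matches:
    print(match.id, match.score, match.metadata)
```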
The results are then displayed in the React frontend (page.tsx).
Troubleshooting
Experiencing any issues with the sample app? Read the Troubleshooting section of the Shop The Look README. If issues persist, submit an issue, create a PR, or post in our community forum!