TemperStack
Intermediate · 8 min read · Updated Mar 18, 2026

How to integrate the AI SDK for embeddings on Vercel

Quick Answer

Integrate the AI SDK for embeddings on Vercel by installing the ai and @ai-sdk/openai packages, configuring your API key as an environment variable, and creating an API route that generates embeddings with the embed function. Then deploy the project to Vercel with the same environment variable configured in the dashboard.

Prerequisites

  1. Basic knowledge of JavaScript/TypeScript
  2. Vercel account and project setup
  3. OpenAI or compatible AI provider API key
  4. Understanding of vector embeddings concepts
Step 1: Install AI SDK Dependencies

Navigate to your project directory and install the required packages:

```shell
npm install ai @ai-sdk/openai
```

This installs the core AI SDK and the OpenAI provider for generating embeddings.
Tip
You can also use other providers like @ai-sdk/anthropic or @ai-sdk/google depending on your needs
Step 2: Configure Environment Variables

Create a .env.local file in your project root and add your API key:

```
OPENAI_API_KEY=your_openai_api_key_here
```

Make sure to add .env.local to your .gitignore file to keep your API key secure.
Tip
Never commit API keys to your repository - always use environment variables
Step 3: Create an Embedding API Route

Create a new file app/api/embeddings/route.ts (for App Router) or pages/api/embeddings.ts (for Pages Router):
```typescript
import { embed } from 'ai';
import { openai } from '@ai-sdk/openai';

export async function POST(req: Request) {
  const { text } = await req.json();

  const { embedding } = await embed({
    model: openai.embedding('text-embedding-3-small'),
    value: text,
  });

  return Response.json({ embedding });
}
```
This creates an endpoint that accepts text and returns embeddings.
Tip
Use 'text-embedding-3-small' for cost efficiency or 'text-embedding-3-large' for better accuracy
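If you need to embed several strings at once, the AI SDK also exports embedMany, which batches values into a single call. A hypothetical batch route (the path app/api/embeddings/batch/route.ts and the { texts } request shape are illustrative choices, not from the tutorial above) could look like:

```typescript
import { embedMany } from 'ai';
import { openai } from '@ai-sdk/openai';

export async function POST(req: Request) {
  // Expects a JSON body of the form { texts: string[] }
  const { texts } = await req.json();

  // `embeddings` is an array of number[] vectors, one per input,
  // in the same order as `values`.
  const { embeddings } = await embedMany({
    model: openai.embedding('text-embedding-3-small'),
    values: texts,
  });

  return Response.json({ embeddings });
}
```

Batching reduces round trips and is useful when indexing documents in bulk.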
Step 4: Create the Client-Side Integration

Create a component or page to use the embeddings API:
```typescript
const generateEmbedding = async (text: string) => {
  const response = await fetch('/api/embeddings', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ text }),
  });

  const { embedding } = await response.json();
  return embedding;
};
```
This function sends text to your API route and receives the embedding vector.
Tip
Consider implementing error handling and loading states for better user experience
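Following that tip, one way to add basic error handling to the helper (an illustrative sketch, not part of the original tutorial) is to check the response status before parsing:

```typescript
// Wrapper around the /api/embeddings route from step 3, with a
// status check so failed requests surface as errors instead of
// silently returning undefined.
const generateEmbedding = async (text: string): Promise<number[]> => {
  const response = await fetch('/api/embeddings', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ text }),
  });

  if (!response.ok) {
    throw new Error(`Embedding request failed with status ${response.status}`);
  }

  const { embedding } = await response.json();
  return embedding;
};
```

Callers can then wrap the call in try/catch and drive loading/error UI state from it.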
Step 5: Configure Vercel Environment Variables

In your Vercel dashboard, go to your project settings and click Environment Variables. Add your API key:
  • Key: OPENAI_API_KEY
  • Value: Your actual API key
  • Environment: Production, Preview, and Development
Click Save to store the variable securely.
Tip
Set environment variables for all environments to ensure consistent behavior across deployments
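If you prefer the terminal and have the Vercel CLI installed and linked to the project, the same variable can be managed from the command line:

```shell
# Prompts for the value and the environments to apply it to
vercel env add OPENAI_API_KEY

# Pull the stored variables into .env.local for local development
vercel env pull .env.local
```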
Step 6: Test Embeddings Locally

Start your development server to test the integration:
```shell
npm run dev
```
Test your embedding endpoint using a tool like curl or your frontend:
```shell
curl -X POST http://localhost:3000/api/embeddings \
  -H "Content-Type: application/json" \
  -d '{"text":"Hello, world!"}'
```
Verify that you receive a numerical array representing the embedding.
Tip
Check the browser network tab or server logs if you encounter any issues during testing
Step 7: Deploy to Vercel

Deploy your project using the Vercel CLI or by pushing to your connected Git repository:
```shell
vercel --prod
```
Or commit and push your changes if using Git integration. Vercel will automatically build and deploy your project with the AI SDK embeddings functionality.
Tip
Monitor the deployment logs in Vercel dashboard to ensure successful deployment
Step 8: Implement Vector Storage (Optional)

For production use, consider storing embeddings in a vector database. Install a vector database client:
```shell
npm install @pinecone-database/pinecone
```
Modify your API route to store embeddings:
```typescript
// Assumes a Pinecone index initialized elsewhere, e.g.:
//   import { Pinecone } from '@pinecone-database/pinecone';
//   const pc = new Pinecone({ apiKey: process.env.PINECONE_API_KEY! });
//   const index = pc.index('embeddings');

// After generating the embedding, upsert it as a record
await index.upsert([
  {
    id: crypto.randomUUID(), // any unique record ID
    values: embedding,
    metadata: { text },
  },
]);
```
This enables similarity search and retrieval capabilities.
Tip
Popular vector databases for Vercel include Pinecone, Weaviate, and Supabase Vector
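Similarity search over stored embeddings typically ranks records by cosine similarity between vectors. As a quick illustration of what the database computes under the hood (a self-contained sketch, not tied to any particular client library):

```typescript
// Cosine similarity between two embedding vectors.
// Ranges from -1 to 1; higher means more semantically similar.
function cosineSimilarity(a: number[], b: number[]): number {
  if (a.length !== b.length) {
    throw new Error('Vectors must have the same length');
  }
  let dot = 0;
  let normA = 0;
  let normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}
```

Comparing a query embedding against stored embeddings with this metric is the basis of semantic search.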

Troubleshooting

API key not found or authentication failed
Verify that OPENAI_API_KEY is correctly set in Vercel environment variables and redeploy your application. Check that the API key has sufficient credits and permissions.
Embedding API returns 500 internal server error
Check the Vercel function logs in your dashboard. Common issues include incorrect model names, malformed requests, or rate limiting. Ensure your embed function call syntax matches the AI SDK documentation.
Embeddings request times out
Large texts may exceed Vercel's serverless function timeout. Split long texts into smaller chunks before processing, or consider upgrading to Vercel Pro for longer execution times.
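As an illustration of that chunking approach, here is a naive character-based splitter (a hypothetical helper; production pipelines often split on tokens or sentence boundaries instead):

```typescript
// Splits text into pieces of at most maxLen characters, preferring
// to break on whitespace so words are not cut in half.
function chunkText(text: string, maxLen: number): string[] {
  const chunks: string[] = [];
  let remaining = text.trim();
  while (remaining.length > maxLen) {
    // Break at the last space within the limit, if any.
    let cut = remaining.lastIndexOf(' ', maxLen);
    if (cut <= 0) cut = maxLen; // no space found: hard cut
    chunks.push(remaining.slice(0, cut).trim());
    remaining = remaining.slice(cut).trim();
  }
  if (remaining.length > 0) chunks.push(remaining);
  return chunks;
}
```

Each chunk can then be embedded separately, keeping individual requests well under the function timeout.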
CORS errors when calling embedding API from frontend
Same-origin requests from your own Next.js frontend do not trigger CORS, so no extra configuration is needed in that case. If the endpoint is called from a different origin, return the appropriate headers from your route (for example Access-Control-Allow-Origin and Access-Control-Allow-Methods: POST) and respond to the OPTIONS preflight request.
