TemperStack
Intermediate · 12 min read · Updated Mar 18, 2026

How to create AI agents with functions on Vercel

Quick Answer

Create AI agents on Vercel by setting up a Next.js project with API routes for agent functions, configuring environment variables for AI services, and deploying using Vercel CLI. Use Vercel Functions to handle AI interactions and agent logic execution.

Prerequisites

  1. Node.js installed on your machine
  2. Vercel account with CLI access
  3. Basic understanding of JavaScript/TypeScript
  4. OpenAI API key or similar AI service credentials
Step 1: Initialize Next.js project and install dependencies

Create a new Next.js project by running npx create-next-app@latest ai-agents --typescript --tailwind --eslint in your terminal. Navigate to the project directory and install AI libraries with npm install openai @vercel/analytics @vercel/functions. This sets up the foundation for your AI agent application.
Tip
Use TypeScript for better type safety when working with AI APIs and function responses.
Step 2: Create agent function API routes

In the pages/api directory, create a new file called agent.ts and set up your API route with export default async function handler(req: NextApiRequest, res: NextApiResponse). If you use the App Router, create app/api/agent/route.ts instead and export a named POST function that receives a web-standard Request. Implement your AI agent logic using OpenAI's function calling capabilities, defining your agent's available functions as an array of function schemas.
Tip
Structure your functions with clear descriptions and parameter schemas for better AI understanding.
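A minimal sketch of such a schema array, assuming OpenAI's Chat Completions function-calling format; getWeather and its parameters are hypothetical examples, not part of the tutorial:

```typescript
// Hypothetical tool schemas in OpenAI's function-calling format.
// Each entry tells the model what a function does and what
// parameters it accepts, expressed as JSON Schema.
const tools = [
  {
    type: "function",
    function: {
      name: "getWeather",
      description: "Get the current weather for a city",
      parameters: {
        type: "object",
        properties: {
          city: { type: "string", description: "City name, e.g. Berlin" },
        },
        required: ["city"],
      },
    },
  },
];

// In the route handler, this array would be passed to the model, e.g.:
// const completion = await openai.chat.completions.create({
//   model: "gpt-4o", messages, tools });
console.log(`Registered ${tools.length} tool schema(s)`);
```

The clearer the description and parameter schema, the more reliably the model picks the right function with well-formed arguments.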
Step 3: Configure environment variables

Create a .env.local file in your project root and add your AI service credentials like OPENAI_API_KEY=your_key_here. In your Vercel dashboard, navigate to Settings > Environment Variables and add the same variables for production deployment. This ensures your AI agent can authenticate with external services.
Tip
Never commit API keys to version control - use Vercel's environment variable management instead.
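The .env.local file from this step might look like this (the value is a placeholder):

```
# .env.local — loaded automatically by Next.js in development;
# keep it listed in .gitignore so real keys are never committed
OPENAI_API_KEY=your_key_here
```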
Step 4: Implement function execution logic

Create a functions directory in your project and define individual function files like weather.ts, calculator.ts, etc. Each function should export an async function that your AI agent can call. In your main agent API route, create a function registry that maps function names to their implementations using const functionRegistry = { weather, calculator }.
Tip
Keep functions pure and focused on single responsibilities for better agent performance.
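The registry-and-dispatch pattern described above can be sketched as follows; calculator and weather here are toy stand-ins for real implementations:

```typescript
// Toy implementations standing in for real agent functions.
async function calculator(args: { a: number; b: number }): Promise<number> {
  return args.a + args.b;
}

async function weather(args: { city: string }): Promise<string> {
  // A real implementation would call a weather API here.
  return `No live data for ${args.city} in this sketch`;
}

// Registry mapping the names the model emits to their implementations.
const functionRegistry: Record<string, (args: any) => Promise<unknown>> = {
  calculator,
  weather,
};

// Dispatch a tool call requested by the model.
async function executeFunction(name: string, args: unknown): Promise<unknown> {
  const fn = functionRegistry[name];
  if (!fn) throw new Error(`Unknown function: ${name}`);
  return fn(args);
}

executeFunction("calculator", { a: 2, b: 3 }).then((r) => console.log(r)); // logs 5
```

Throwing on an unknown name makes a mismatch between the schemas the model sees and the registry fail loudly instead of silently returning nothing.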
Step 5: Set up frontend interface

Create a chat interface in your main page component with a form for user input and a display area for agent responses. Use fetch('/api/agent', { method: 'POST', body: JSON.stringify({ message }) }) to send user messages to your agent. For real-time interactions, stream the agent's output by returning a streamed Response from your route and reading it incrementally on the client.
Tip
Add loading states and error handling to improve user experience during AI processing.
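A minimal client helper for this step might look like the following; it assumes the route responds with JSON shaped like { reply: string }, which is an assumption, not something the tutorial specifies:

```typescript
// Build the fetch options for a message to the agent endpoint.
function buildAgentRequest(message: string) {
  return {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ message }),
  };
}

// Send a message and return the agent's reply. Assumes the API
// responds with JSON shaped like { reply: string } — adjust to
// match whatever your route actually returns.
async function sendMessage(message: string): Promise<string> {
  const res = await fetch("/api/agent", buildAgentRequest(message));
  if (!res.ok) throw new Error(`Agent request failed: ${res.status}`);
  const data = await res.json();
  return data.reply;
}
```

Separating payload construction from the network call keeps the request shape easy to test without a running server.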
Step 6: Configure Vercel deployment settings

Create a vercel.json file in your project root to configure function timeouts and regions with {"functions": {"pages/api/agent.ts": {"maxDuration": 30}}}. This ensures your AI agent functions have sufficient time to process complex requests. Set appropriate memory limits based on your agent's computational needs.
Tip
Use the Edge runtime for faster cold starts by adding export const runtime = 'edge' to your API routes; note that the Edge runtime does not support every Node.js API, so check that your dependencies are compatible.
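Putting these settings together, a vercel.json for a Pages Router project might look like this; the memory value is an illustrative choice, not a requirement:

```json
{
  "functions": {
    "pages/api/agent.ts": {
      "maxDuration": 30,
      "memory": 1024
    }
  }
}
```

If your route lives in the App Router, use that file's path (for example app/api/agent/route.ts) as the key instead.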
Step 7: Deploy and test your AI agent

Run vercel --prod in your terminal to deploy your AI agent to production. Vercel will automatically detect your Next.js project and configure the build settings. Test your deployed agent by visiting the provided URL and interacting with the chat interface to ensure all functions work correctly in the production environment.
Tip
Use Vercel's preview deployments to test changes before pushing to production.

Troubleshooting

API route timeout errors during AI processing
Increase the maxDuration setting in vercel.json (up to the limit your Vercel plan allows) and consider implementing streaming responses to prevent timeouts during long AI operations.
Environment variables not accessible in production
Ensure environment variables are added in Vercel dashboard under Settings > Environment Variables and redeploy your project using vercel --prod --force.
AI agent functions not executing properly
Verify your function schemas match OpenAI's expected format and check that your function registry correctly maps function names to implementations. Enable debug logging to trace execution flow.
Cold start delays affecting agent response time
Switch to edge runtime by adding export const runtime = 'edge' to your API routes, or implement function warming using Vercel Cron Jobs to keep functions active.
