Deploy a Plugin with Vercel Serverless Functions
Vercel is a platform that lets you deploy serverless functions straight from your development environment to the cloud.
Advantages of Using Vercel
- 🚀 Zero configuration deployment: Push your code, and Vercel handles the rest.
- 🌍 Global CDN: Delivers your content faster to users, no matter where they are.
- ⚡️ Serverless Functions: Write your functions in Python, Node.js, Ruby, or Go and Vercel will run and scale them for you.
Deploying a Plugin with Vercel
Let's walk you through the process!
Step 1: Create a New Project
Start by creating a new project on your local machine. You can do this by making a new directory and initializing it with
git. This will create a new git repository in that directory.
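Assuming a project name of my-plugin (a placeholder; use any name you like), this step boils down to:

```shell
# Create the project directory and enter it (my-plugin is a placeholder name)
mkdir my-plugin
cd my-plugin

# Initialize an empty git repository in the current directory
git init
```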
Step 2: Write Your Serverless Function
Inside your project directory, create a new directory called api. Inside api, you can create Python files which will become your serverless functions. Here's an example of a simple serverless function that responds to HTTP GET requests:
```python
from http.server import BaseHTTPRequestHandler

class handler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Respond to GET requests with a plain-text greeting
        self.send_response(200)
        self.send_header("Content-Type", "text/plain")
        self.end_headers()
        self.wfile.write(b"Hello from the serverless function!")
```
This function will be available at /api/hello once deployed, because Vercel maps each file in the api directory to a route of the same name.
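Outside of Vercel, you can sanity-check this kind of handler locally with Python's built-in HTTP server. The sketch below re-declares a handler equivalent to the one in api/hello.py inline (the class name handler follows Vercel's Python runtime convention; the response text is illustrative):

```python
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import urlopen

class handler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Same shape as the Vercel function: 200 + plain-text body
        self.send_response(200)
        self.send_header("Content-Type", "text/plain")
        self.end_headers()
        self.wfile.write(b"Hello from the serverless function!")

# Bind to port 0 so the OS picks a free port, and serve in the background
server = HTTPServer(("127.0.0.1", 0), handler)
threading.Thread(target=server.serve_forever, daemon=True).start()

body = urlopen(f"http://127.0.0.1:{server.server_port}/api/hello").read()
print(body.decode())  # -> Hello from the serverless function!
server.shutdown()
```

This only exercises the request-handling logic; routing and scaling are handled by Vercel in production.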
Step 3: Create the Plugin Manifest
The manifest file, ai-plugin.json, contains metadata about your plugin and is essential for the OpenAI Plugin System to recognize your service. This file must be accessible from the internet, so on Vercel it belongs in a directory of static files.

In Vercel's case, we can use the public directory to serve static files, and this is where we will place our ai-plugin.json file. To be more specific, it should be placed in the public/.well-known/ subdirectory.

Here are the detailed steps to create the manifest file:

1. public directory: If you have not created the public directory in your project's root directory, do so now with the command mkdir public.
2. .well-known subdirectory: Inside the public directory, create a subdirectory named .well-known. This is a standard convention followed across the web for well-known services. Use the command mkdir public/.well-known.
3. ai-plugin.json file: Inside the .well-known subdirectory, create your ai-plugin.json file. This file should contain metadata about your plugin, such as its name, version, and the paths for its endpoints.

Note that in the urls object, the paths are relative to your deployment's root URL.
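As a sketch, an ai-plugin.json for this project might look like the following (field names follow the OpenAI plugin manifest schema; every value here is a placeholder to replace with your own):

```json
{
  "schema_version": "v1",
  "name_for_human": "Hello Plugin",
  "name_for_model": "hello",
  "description_for_human": "A minimal example plugin deployed on Vercel.",
  "description_for_model": "Responds to greetings via a single GET endpoint.",
  "auth": { "type": "none" },
  "api": {
    "type": "openapi",
    "url": "/openapi.yaml"
  },
  "logo_url": "/logo.png",
  "contact_email": "you@example.com",
  "legal_info_url": "https://example.com/legal"
}
```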
After completing these steps, your project's directory structure should look like this:

```
.
├── api
│   └── hello.py
└── public
    └── .well-known
        └── ai-plugin.json
```
Step 4: Define the OpenAPI Specification
Next, return to the public directory and create an openapi.yaml or openapi.json file. This file will specify your plugin's API endpoints and the responses they should return. You can use the OpenAPI specification to describe your serverless functions' behavior.
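A minimal openapi.yaml describing the hello function might look like this (the operationId, summary, and descriptions are illustrative):

```yaml
openapi: 3.0.1
info:
  title: Hello Plugin API
  version: "1.0"
paths:
  /api/hello:
    get:
      operationId: sayHello
      summary: Return a plain-text greeting
      responses:
        "200":
          description: A greeting message
          content:
            text/plain:
              schema:
                type: string
```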
Step 5: Deploy Your Function with Vercel
Finally, you can deploy your function to Vercel. You can do this by installing the Vercel CLI with npm i -g vercel, logging in with vercel login, and then running vercel inside your project directory. Your function will be deployed, and Vercel will give you a URL where you can access it.
Your Python function is now deployed with Vercel and can be used as a ChatGPT plugin!