Deploy a Plugin with Vercel Serverless Functions

Vercel is a platform that lets you deploy serverless functions straight from your development environment to the cloud.

Advantages of Using Vercel

  • 🚀 Zero configuration deployment: Push your code, and Vercel handles the rest.
  • 🌍 Global CDN: Delivers your content to users faster, no matter where they are.
  • ⚡️ Serverless Functions: Write your functions in Python, Node.js, Ruby, or Go, and Vercel will run and scale them for you.

Deploying a Plugin with Vercel

Let's walk through the process!

Step 1: Create a New Project

Start by creating a new project on your local machine: make a new directory and initialize it as a git repository.

mkdir my-vercel-function
cd my-vercel-function
git init

Step 2: Write Your Serverless Function

Inside your project directory, create a new directory called api. Any Python files you place in api become serverless functions. Here's an example of a simple serverless function that responds to HTTP GET requests:

# api/hello.py
from http.server import BaseHTTPRequestHandler

class handler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Respond to GET requests with a plain-text greeting
        self.send_response(200)
        self.send_header('Content-type', 'text/plain')
        self.end_headers()
        self.wfile.write('Hello, world!'.encode('utf-8'))
        return

This function will be available at /api/hello.
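
If you want to try the function locally before deploying, one option is the Vercel CLI's development server (the CLI is installed in Step 5) together with curl. This is just a quick sketch; the port shown is the CLI's default and may differ on your machine.

vercel dev
curl http://localhost:3000/api/hello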

Step 3: Create the Plugin Manifest

The manifest file, known as ai-plugin.json, contains metadata about your plugin and is required for the OpenAI Plugin System to recognize your service. It must be reachable from the internet, and Vercel serves static files from the public directory, so that is where we will place it. More specifically, it belongs in the public/.well-known subdirectory.

Here are the detailed steps to create the manifest file:

  1. Create the public Directory: If you have not created the public directory in your project's root directory, do so now with the command mkdir public.

  2. Create the .well-known Subdirectory: Inside the public directory, create a subdirectory named .well-known. This is a standard convention followed across the web for well-known services. Use the command mkdir public/.well-known.

  3. Create the ai-plugin.json File: Inside the .well-known subdirectory, create your ai-plugin.json file. It should contain metadata about your plugin, such as its name, version, and the paths for the openapi and install endpoints. An example ai-plugin.json might look like this:

    {
      "name": "your-plugin-name",
      "version": "1.0.0",
      "urls": {
        "openapi": "/openapi.yaml",
        "install": "/api/install"
      }
    }

    Note that in the urls object, the paths are relative to your deployment's root URL.

After completing these steps, your project's directory structure should look like this:

.
├── api
│   └── hello.py
└── public
    ├── .well-known
    │   └── ai-plugin.json
    └── openapi.yaml

Step 4: Define the OpenAPI Specification

Next, return to the public directory and create an openapi.yaml or openapi.json file. This file will specify your plugin's API endpoints and the responses they should return. You can use the OpenAPI specification to describe your serverless functions' behavior.
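
As an illustration, here is a minimal sketch of what public/openapi.yaml could look like for the hello endpoint from Step 2. The title, operationId, and server URL are placeholders; substitute your own values once you know your deployment URL.

openapi: 3.0.1
info:
  title: My Vercel Plugin
  version: 1.0.0
servers:
  - url: https://your-deployment.vercel.app
paths:
  /api/hello:
    get:
      operationId: sayHello
      summary: Returns a plain-text greeting
      responses:
        "200":
          description: A greeting message
          content:
            text/plain:
              schema:
                type: string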

Step 5: Deploy Your Function with Vercel

Finally, you can deploy your function to Vercel. You can do this by installing the Vercel CLI with npm i -g vercel, logging in with vercel login, and then running vercel inside your project directory. Your function will be deployed, and Vercel will give you a URL where you can access it.
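
For reference, the full sequence of commands looks like this. The URL in the last line is only an example; use the deployment URL the CLI prints for your project.

npm i -g vercel
vercel login
vercel
curl https://your-deployment.vercel.app/.well-known/ai-plugin.json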

Your Python function is now deployed with Vercel and can be used as a ChatGPT plugin!