Reviewing "npm i ai"

Jun 18, 2023 · 4 min read

Recently, Vercel published a new npm package to make building applications with AI easy. The package is simply called "ai". I like that they were able to get that name!

Having previously built an app with OpenAI and hooked everything up manually, I wanted to see how much easier this would make my life when creating something very trivial.

Here is the end result:

The idea

I want to make a one-page app that simply asks a user to input some symptoms for a medical problem. Then I want the AI to act as a doctor and give its best shot at figuring out what to do next. Disclaimer: as it is AI, this will not be perfect, but it seemed like a cool idea.


Prerequisites

  1. Have a Next.js app created, you can use create-next-app for this.

  2. Have an OpenAI API account created.

Using the package

To install the package, run one of the following commands:

npm i ai
yarn add ai
pnpm add ai

As I used OpenAI to power my app, I had to add an extra package to make sure that my edge function behaves as I intend it to.

npm i openai-edge
yarn add openai-edge
pnpm add openai-edge

I then created my edge function to power the API endpoint, which also included a specifically worded prompt. Below is a code snippet of my API route.

API Route

import { Configuration, OpenAIApi } from "openai-edge";
import { OpenAIStream, StreamingTextResponse } from "ai";

const config = new Configuration({
  apiKey: process.env.OPENAI_API_KEY,
});
const openai = new OpenAIApi(config);

export const runtime = "edge";

export async function POST(req: Request) {
  const { prompt } = await req.json();

  const response = await openai.createCompletion({
    model: "text-curie-001", // curie because davinci is expensive.
    stream: true,
    max_tokens: 200,
    temperature: 0.2,
    prompt: `I want you to act as a doctor. My first request is "${prompt}."`,
  });

  const stream = OpenAIStream(response);
  return new StreamingTextResponse(stream);
}

The fact that you can handle the streaming of the OpenAI response in two lines of easily readable code blows my mind...
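If you are curious what the client actually receives from that streaming response, here is a minimal sketch (my own illustration, not part of the "ai" package) of consuming a streamed text body by hand, assuming a standard ReadableStream of bytes:

```typescript
// Hypothetical helper: reads a streamed response body chunk by chunk
// and accumulates the decoded text as it arrives.
async function readTextStream(stream: ReadableStream<Uint8Array>): Promise<string> {
  const reader = stream.getReader();
  const decoder = new TextDecoder();
  let text = "";
  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    // { stream: true } handles multi-byte characters split across chunks.
    text += decoder.decode(value, { stream: true });
  }
  return text;
}
```

This is roughly the kind of chunk-by-chunk reading the SDK's client hooks do for you, updating the UI as each piece of text lands.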

The prompt is an important trick to getting good responses from an LLM (Large Language Model), so make sure you spend some time crafting the right prompt. You can try this out in Vercel's AI playground.
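One way to make that iteration easier is to pull the prompt template out into a small helper, so the wording lives in one place. The function name here is just illustrative, not part of the SDK:

```typescript
// Illustrative helper: wraps the user's symptoms in the role-playing
// instruction sent to the model. Tweak the wording here to tune responses.
function buildDoctorPrompt(symptoms: string): string {
  return `I want you to act as a doctor. My first request is "${symptoms}."`;
}
```

In the API route above, this would replace the inline template literal passed as `prompt`.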

Here is where the magic happens for the front end, and it is really cool. To interact with the above API endpoint and get full streaming capabilities in your UI, you can create a component that looks like this.

UI Component

'use client'

import { useCompletion } from 'ai/react';

export default function CompletionPage() {
  const { completion, input, handleInputChange, handleSubmit, isLoading } = useCompletion();

  return (
    <div className="mx-auto w-full max-w-md py-24 flex flex-col stretch">
      <form onSubmit={handleSubmit}>
        <input
          className="w-full max-w-md border border-gray-300 rounded mb-2 shadow-xl p-2"
          value={input}
          onChange={handleInputChange}
          placeholder="Describe your symptoms..."
        />
        <button className='bg-gray-800 hover:bg-gray-700 disabled:bg-gray-500 disabled:cursor-not-allowed w-fit text-white py-2 px-6 rounded-lg shadow-xl mt-4' disabled={isLoading} type="submit">
          Submit
        </button>
      </form>
      <div className="whitespace-pre-wrap my-6">{completion}</div>
    </div>
  );
}

This provides the user with a simple form input and submission UI as well as the output of our API response.

What does each part of our deconstructed useCompletion() hook do?

  1. completion - This is the value of the in-flight streaming response which is updated as the UI interacts with our edge function.

  2. input - This is the value of our captured input field.

  3. handleInputChange - This is a function which will update the value of the input.

  4. handleSubmit - This will perform the submission of the form and by default call the API via the route '/api/completion'. This can be customized by overriding the api option inside the hook, like so: useCompletion({ api: '/api/chat' });
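To make that default explicit, here is a tiny sketch of the endpoint-selection behaviour described in point 4. The function is hypothetical, the real hook resolves this internally:

```typescript
// Hypothetical illustration of the default-vs-custom endpoint logic:
// useCompletion posts to '/api/completion' unless an `api` option is given.
function resolveCompletionEndpoint(options?: { api?: string }): string {
  return options?.api ?? "/api/completion";
}
```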

The way I like to look at this is that useCompletion provides everything needed to capture a value, submit it, and tell us what is happening with that submission, i.e. isLoading.

Closing thoughts

Overall, I like how easy Vercel has made it to get started with a simple AI application. The lower the barrier to entry for these sorts of apps, the faster the innovation, and that is a key consideration in today's environment. The API is really easy to use. From idea to deployment, it took me about an hour and a half, most of which was polishing up my UI.

I encourage every reader to create a very simple Next.js app and have a blast with this new SDK.

If you haven't yet tried out the package, I will leave some links below to help you get started.