
⛅️ ✨ workers-ai-provider ✨ ⛅️

A custom provider that lets you use Cloudflare Workers AI models with the Vercel AI SDK.

Install

npm install workers-ai-provider
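
The usage example below also imports from the ai package (the Vercel AI SDK), so if it is not already part of your project, install it alongside the provider:

npm install ai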

Usage

First, set up an AI binding in wrangler.toml in your Workers project:

# ...
[ai]
binding = "AI"
# ...

Then in your Worker, import the factory function and create a new AI provider:

// index.ts
import { createWorkersAI } from "workers-ai-provider";
import { streamText } from "ai";

type Env = {
  AI: Ai;
};

export default {
  async fetch(req: Request, env: Env) {
    const workersai = createWorkersAI({ binding: env.AI });
    // Use the AI provider to interact with the Vercel AI SDK
    // Here, we generate a chat stream based on a prompt
    const text = await streamText({
      model: workersai("@cf/meta/llama-2-7b-chat-int8"),
      messages: [
        {
          role: "user",
          content: "Write an essay about hello world",
        },
      ],
    });

    return text.toTextStreamResponse({
      headers: {
        // add these headers to ensure that the
        // response is chunked and streamed
        "Content-Type": "text/x-unknown",
        "content-encoding": "identity",
        "transfer-encoding": "chunked",
      },
    });
  },
};
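
The provider also works with the Vercel AI SDK's non-streaming functions. As a minimal sketch (assuming the same AI binding and model as above), you can return the full completion in one call with generateText:

// index.ts
import { createWorkersAI } from "workers-ai-provider";
import { generateText } from "ai";

type Env = {
  AI: Ai;
};

export default {
  async fetch(req: Request, env: Env) {
    const workersai = createWorkersAI({ binding: env.AI });
    // Generate the complete response in one call instead of streaming it
    const result = await generateText({
      model: workersai("@cf/meta/llama-2-7b-chat-int8"),
      messages: [
        {
          role: "user",
          content: "Write an essay about hello world",
        },
      ],
    });

    // result.text holds the generated completion
    return new Response(result.text);
  },
};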

For more info, refer to the documentation of the Vercel AI SDK.

Credits

Based on work by Dhravya Shah and the Workers AI team at Cloudflare.
