@friendliai/ai-provider

Learn how to use the FriendliAI provider for the Vercel AI SDK.

Installation

You can install the package via npm:

npm i @friendliai/ai-provider

Credentials

The personal access token required to use the models can be obtained from the Friendli Suite.

To use the provider, you need to set the FRIENDLI_TOKEN environment variable with your personal access token.

export FRIENDLI_TOKEN="YOUR_FRIENDLI_TOKEN"

Check the FriendliAI documentation for more information.

Provider Instance

You can import the default provider instance friendli from @friendliai/ai-provider:

import { friendli } from "@friendliai/ai-provider";
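
If you need to configure the provider yourself (for example, to pass the token explicitly instead of relying on the FRIENDLI_TOKEN environment variable), a minimal sketch follows, assuming the package exports a createFriendli factory in the usual AI SDK provider style; check the package docs for the exact options:

import { createFriendli } from "@friendliai/ai-provider";

// Assumed factory and option names; verify against the package documentation.
const friendli = createFriendli({
  apiKey: process.env.FRIENDLI_TOKEN,
});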

Language Models

You can create FriendliAI models using a provider instance. The first argument is the model id, e.g. meta-llama-3.1-8b-instruct.

const model = friendli("meta-llama-3.1-8b-instruct");

Example: Generating text

You can use FriendliAI language models to generate text with the generateText function:

import { friendli } from "@friendliai/ai-provider";
import { generateText } from "ai";

const { text } = await generateText({
  model: friendli("meta-llama-3.1-8b-instruct"),
  prompt: "What is the meaning of life?",
});
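
The same model also works with streamText when you want tokens as they are generated. A minimal sketch (the prompt is illustrative):

import { friendli } from "@friendliai/ai-provider";
import { streamText } from "ai";

const result = await streamText({
  model: friendli("meta-llama-3.1-8b-instruct"),
  prompt: "Write a haiku about GPUs.",
});

// Consume the stream chunk by chunk as it arrives.
for await (const chunk of result.textStream) {
  process.stdout.write(chunk);
}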

Example: Using built-in tools (Beta)

With @friendliai/ai-provider, you can enable FriendliAI's built-in tools via the tools option.

Built-in tools let the model invoke tools while answering, which can improve answer quality. For example, the web:search tool can provide up-to-date answers to questions about current events.

import { friendli } from "@friendliai/ai-provider";
import { convertToCoreMessages, streamText } from "ai";

export async function POST(req: Request) {
  const { messages } = await req.json();

  const result = await streamText({
    model: friendli("meta-llama-3.1-8b-instruct", {
      tools: [
        { type: "web:search" },
        { type: "math:calculator" },
        { type: "code:python-interpreter" }, // and more tools..!!
      ],
    }),
    messages: convertToCoreMessages(messages),
  });

  return result.toDataStreamResponse();
}

FriendliAI language models can also be used with the streamText, generateObject, streamObject, and streamUI functions (see AI SDK Core and AI SDK RSC), as sketched below.
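
For example, a minimal generateObject sketch (the zod schema and prompt are illustrative):

import { friendli } from "@friendliai/ai-provider";
import { generateObject } from "ai";
import { z } from "zod";

// Generate structured JSON that conforms to a zod schema.
const { object } = await generateObject({
  model: friendli("meta-llama-3.1-8b-instruct"),
  schema: z.object({
    title: z.string(),
    tags: z.array(z.string()),
  }),
  prompt: "Suggest a title and tags for a blog post about inference serving.",
});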

OpenAI Compatibility

You can also use FriendliAI through @ai-sdk/openai, since the FriendliAI API is OpenAI-compatible:

import { createOpenAI } from "@ai-sdk/openai";

const friendli = createOpenAI({
  baseURL: "https://inference.friendli.ai/v1",
  apiKey: process.env.FRIENDLI_TOKEN,
});
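
The resulting instance can then be used like any other AI SDK provider; for example (model id and prompt are illustrative):

import { generateText } from "ai";

const { text } = await generateText({
  model: friendli("meta-llama-3.1-8b-instruct"),
  prompt: "Explain what FriendliAI does in one sentence.",
});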