fxn-llm v0.0.3 • Apache-2.0 • last published 10 months ago

Function LLM for JavaScript

https://github.com/user-attachments/assets/86ae6012-264e-437f-9ab8-94408f4105ba


Use local LLMs in your browser and Node.js apps. This package is designed to patch OpenAI and Anthropic clients for running inference locally, using predictors hosted on Function.

> [!TIP]
> We offer a similar package for use in Python. Check out fxn-llm.

> [!IMPORTANT]
> This package is still a work in progress, so the API may change drastically between releases.

> [!CAUTION]
> Never embed access keys client-side (i.e. in the browser). Instead, create a proxy URL in your backend.
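One way to follow this advice is to keep the access key in a server-only environment variable and attach it to requests on the backend. A minimal sketch of that idea, assuming a hypothetical `withAccessKey` helper and a `Bearer`-style authorization header (neither is part of this package's API):

```javascript
// Hypothetical server-side helper: attach the access key to outbound
// request headers so it never ships to the browser.
function withAccessKey (headers = {}) {
  const accessKey = process.env.FXN_ACCESS_KEY; // server-only env var
  if (!accessKey)
    throw new Error("FXN_ACCESS_KEY is not set");
  // Assumption: the upstream API accepts a Bearer authorization header
  return { ...headers, Authorization: `Bearer ${accessKey}` };
}
```

A proxy route in your backend would call `withAccessKey()` on the client's headers before forwarding the request upstream, so the key only ever exists server-side.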

Installing Function LLM

Function LLM is distributed on NPM. Open a terminal and run the following command:

```bash
# Run this in Terminal
$ npm install fxn-llm
```

> [!IMPORTANT]
> Make sure to create an access key by signing in to Function. You'll need it to fetch the predictor at runtime.

Using the OpenAI Client Locally

To run text generation and embedding models locally using the OpenAI client, patch your OpenAI instance with the `locally` function:

```js
import { locally } from "fxn-llm"
import { OpenAI } from "openai"

// 💥 Create your OpenAI client
let openai = new OpenAI({ apiKey: "fxn", dangerouslyAllowBrowser: true });

// 🔥 Make it local
openai = locally(openai, {
  accessKey: process.env.NEXT_PUBLIC_FXN_ACCESS_KEY
});

// 🚀 Generate embeddings
const embeddings = await openai.embeddings.create({
  model: "@nomic/nomic-embed-text-v1.5-quant",
  input: "search_query: Hello world!"
});
```

> [!WARNING]
> Currently, only `openai.embeddings.create` is supported. Text generation is coming soon!
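Embedding vectors are plain arrays of numbers, so once you have them you can compare a query against documents with ordinary cosine similarity. A minimal sketch; the toy vectors below stand in for real embedding output:

```javascript
// Cosine similarity: dot(a, b) / (|a| * |b|), ranges from -1 to 1
function cosineSimilarity (a, b) {
  let dot = 0, normA = 0, normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}

// Toy stand-ins for a query embedding and a document embedding
const query = [0.1, 0.3, 0.5];
const doc = [0.2, 0.1, 0.4];
console.log(cosineSimilarity(query, doc).toFixed(3)); // ≈ 0.922
```

Ranking documents by this score against a `search_query:`-prefixed query embedding is the typical retrieval pattern for nomic-embed-text models.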


Useful Links

Function is a product of NatML Inc.
