WebGPU Packages

offload-ai

Offload is an in-browser AI stack. It provides an SDK to run AI inference directly on your users' devices, improving data privacy and reducing inference costs.

1.2.1 • Published 9 months ago
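A minimal sketch of what calling an in-browser inference SDK like this might look like. The import, `createOffload`, and `generate` names are illustrative assumptions, not offload-ai's documented API.

```typescript
// Hypothetical usage sketch -- `createOffload` and `generate` are
// assumed names for illustration, not offload-ai's documented API.
import { createOffload } from "offload-ai";

// Initialize the SDK; inference runs locally in the user's browser,
// so prompt data never leaves the device.
const offload = createOffload({ model: "llama-3.2-1b" });

const reply = await offload.generate("Summarize this page in one sentence.");
console.log(reply);
```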

react-llm

Call 30+ LLMs with a single API. Send multiple prompts to multiple LLMs and get the results back in a single response. Lightweight and fast (only one dependency: axios). Bring your own API keys. Works anywhere (Node, Deno, browser). A hedged usage sketch follows the entry below.

0.0.1 • Published 2 years ago
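A hypothetical sketch of fanning one prompt out to several providers with caller-supplied API keys. The `completions` function and its option names are assumptions for illustration, not the package's documented API.

```typescript
// Hypothetical usage sketch -- `completions` and its options are assumed
// names for illustration, not react-llm's documented API.
import { completions } from "react-llm";

// Send one set of prompts to multiple LLMs and receive all results
// in a single response; API keys are provided by the caller.
const results = await completions({
  prompts: ["Explain WebGPU in one sentence."],
  models: ["gpt-4o-mini", "claude-3-haiku"],
  apiKeys: {
    openai: "YOUR_OPENAI_KEY",
    anthropic: "YOUR_ANTHROPIC_KEY",
  },
});
console.log(results);
```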