@locallm/api v0.1.7
LocalLm API
An API to query local language models using different backends. Supported backends include Koboldcpp (used in the example below); see the API doc for the full list of providers.
📚 API doc
Install
npm install @locallm/api
Usage
Example with the Koboldcpp provider:
import { Lm } from "@locallm/api";

// initialize a client for a local Koboldcpp server,
// streaming each generated token to stdout
const lm = new Lm({
  providerType: "koboldcpp",
  serverUrl: "http://localhost:5001",
  onToken: (t) => process.stdout.write(t),
});

// build the prompt from the model's instruct template
const template = "<s>[INST] {prompt} [/INST]";
const _prompt = template.replace("{prompt}", "list the planets in the solar system");

// run the inference query
await lm.infer(_prompt, {
  stream: true,
  temperature: 0.2,
  n_predict: 200,
});
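Since the onToken callback receives each streamed token, the generated text can also be collected into a variable instead of being written to stdout. A minimal sketch using only the options shown above (the output buffer and the prompt text are illustrative, not part of the library):

import { Lm } from "@locallm/api";

// accumulate the streamed tokens into a buffer instead of printing them
let output = "";
const lm = new Lm({
  providerType: "koboldcpp",
  serverUrl: "http://localhost:5001",
  onToken: (t) => { output += t; },
});

const template = "<s>[INST] {prompt} [/INST]";
const _prompt = template.replace("{prompt}", "give a short definition of a language model");

await lm.infer(_prompt, {
  stream: true,
  temperature: 0.2,
  n_predict: 200,
});

// the full response is now available in the buffer
console.log(output);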
Check the examples directory in the repository for more usage examples.