# @llamaindex/autotool v5.0.3

Automatically transpile your JavaScript functions into LLM Agent compatible tools.
## Usage

First, install the package:

```shell
npm install @llamaindex/autotool
# or
pnpm add @llamaindex/autotool
# or
yarn add @llamaindex/autotool
```

Second, add the plugin/loader to your configuration:
### Next.js

```js
import { withNext } from "@llamaindex/autotool/next";

/** @type {import('next').NextConfig} */
const nextConfig = {};

export default withNext(nextConfig);
```
### Node.js

```shell
node --import @llamaindex/autotool/node ./path/to/your/script.js
```
Third, add `"use tool"` at the top of your tool file, or rename the file to end in `.tool.ts`:

```typescript
"use tool";

export function getWeather(city: string) {
  // ...
}

// ...
```
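The stub above might be filled in like this. As I understand it, autotool derives the tool's parameter schema from the TypeScript signature and its description from the JSDoc comment, so documenting the function is worthwhile; the weather lookup below is a hypothetical mock, not a real API call:

```typescript
"use tool";

/**
 * Get the current weather for a city.
 *
 * @param city - The city to look up
 */
export function getWeather(city: string): string {
  // hypothetical mock implementation; a real tool would
  // typically call out to a weather API here
  return `The weather in ${city} is sunny.`;
}
```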
Finally, export a chat handler function to the frontend using the llamaindex Agent:

```typescript
"use server";

// imports ...

export async function chatWithAI(message: string): Promise<JSX.Element> {
  const agent = new OpenAIAgent({
    tools: convertTools("llamaindex"),
  });
  const uiStream = createStreamableUI();
  agent
    .chat({
      stream: true,
      message,
    })
    .then(async (responseStream) => {
      return responseStream.pipeTo(
        new WritableStream({
          start: () => {
            uiStream.append("\n");
          },
          write: async (message) => {
            uiStream.append(message.response.delta);
          },
          close: () => {
            uiStream.done();
          },
        }),
      );
    });
  return uiStream.value;
}
```
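The streaming portion of the handler above can be sketched in isolation with standard Web Streams (no llamaindex imports needed): a `ReadableStream` of response deltas is piped into a `WritableStream` that accumulates output. The chunks here are mock data standing in for the agent's streamed response, and chunks carry a bare `delta` field rather than the handler's `message.response.delta`:

```typescript
// Mock deltas standing in for the agent's streamed response.
const chunks = ["Sunny", " in", " Tokyo"];

const responseStream = new ReadableStream<{ delta: string }>({
  start(controller) {
    for (const delta of chunks) controller.enqueue({ delta });
    controller.close();
  },
});

let output = "";

await responseStream.pipeTo(
  new WritableStream<{ delta: string }>({
    write(chunk) {
      // in the handler above: uiStream.append(message.response.delta)
      output += chunk.delta;
    },
    close() {
      // in the handler above: uiStream.done()
    },
  }),
);

console.log(output); // "Sunny in Tokyo"
```

`pipeTo` resolves only after the readable side closes, which is why `close` is the right place to finalize the UI stream.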
## License

MIT