# llmtk v3.5.7

**LLM Toolkit.** A toolkit of LLM-related functions that simplifies the development process, especially for apps deployed on Vercel.
## Chatbot

### Setup

#### `middleware.ts`
```ts
import { middleware as serviceMiddleware } from 'llmtk/build/serverMiddleware';
import { NextRequestWithAuth, withAuth } from 'next-auth/middleware';
import { NextFetchEvent, NextResponse } from 'next/server';

// Let the llmtk service middleware handle the request first;
// fall through to the rest of the app when it returns nothing.
async function middleware(request: NextRequestWithAuth, event: NextFetchEvent) {
  return (await serviceMiddleware(request)) || NextResponse.next();
}

export default withAuth(middleware);
```
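The middleware above relies on a fall-through pattern: `serviceMiddleware` either returns a response (it handled the request) or a falsy value, in which case the chain continues with `NextResponse.next()`. A minimal sketch of that control flow, using plain mocks in place of Next.js and llmtk (the path prefix and response shapes here are hypothetical, purely for illustration):

```ts
type Res = { body: string };

// Mock service middleware: handles only a hypothetical path prefix,
// returns undefined for everything else.
function serviceMiddleware(path: string): Res | undefined {
  return path.startsWith('/api/llmtk') ? { body: 'handled by service' } : undefined;
}

// Mock stand-in for NextResponse.next()
function next(): Res {
  return { body: 'passed through' };
}

// The same `||` fall-through used in middleware.ts above
function middleware(path: string): Res {
  return serviceMiddleware(path) || next();
}

console.log(middleware('/api/llmtk/tts').body); // "handled by service"
console.log(middleware('/about').body);         // "passed through"
```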
#### API route

Create a file `pages/api/google/tts.ts`:
```ts
import { api } from 'llmtk/build/serverApi';

// Re-export the bundled Google text-to-speech handler as this route's handler.
const { handler } = api.google.tts;

export default handler;
```
## Usage
- Use the basic chat completion function on the client side:

  ```ts
  import { llm, services } from 'llmtk';

  // Without streaming: resolves with the full reply
  const reply = await llm.completion(services.openai.OpenAI_GPT3_5_4k, 'How are you');

  // With streaming: `acc` is the text accumulated so far
  llm.completion(services.openai.OpenAI_GPT3_5_4k, 'How are you', {
    onStream: ({ acc }) => console.log(acc),
  });
  ```
- Use the chat hook (includes chat history and voice input/output):

  ```ts
  import { llm, services } from 'llmtk';

  const {
    // chat history
    history,
    // LLM state
    replying,
    // text input
    sendTextMessage,
    // or voice input
    recorderStatus,
    toggleRecording,
    // voice output
    startSpeaking,
    stopSpeaking,
  } = llm.useChat({
    model: services.openai.OpenAI_GPT3_5_4k,
    prompt: `You are C-3PO, human-cyborg relations.`,
    voiceLanguage: 'en-US',
    voiceName: 'en-US-Neural2-J',
  });
  ```
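The `onStream` callback in the first example receives the accumulated text so far, which lets a UI re-render the growing reply as tokens arrive. A self-contained sketch of that accumulator pattern, with a mock token array standing in for the model stream (`streamTokens` is a hypothetical helper, not part of llmtk):

```ts
type StreamHandler = (chunk: { acc: string; delta: string }) => void;

// Feed tokens one by one to the handler, accumulating the full text,
// mirroring what an onStream consumer observes.
function streamTokens(tokens: string[], onStream: StreamHandler): string {
  let acc = '';
  for (const delta of tokens) {
    acc += delta;
    onStream({ acc, delta });
  }
  return acc; // final reply, same as the non-streaming return value
}

const snapshots: string[] = [];
streamTokens(['How ', 'can ', 'I ', 'help?'], ({ acc }) => snapshots.push(acc));
console.log(snapshots.at(-1)); // "How can I help?"
```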
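The exact types returned by `llm.useChat` are not documented here; the following sketch models the shape shown above with a mock, so the text-message flow is concrete. All names and types (`ChatMessage`, `createMockChat`, the canned-reply function) are assumptions for illustration, not llmtk's API:

```ts
type ChatMessage = { role: 'user' | 'assistant'; content: string };

interface MockChat {
  history: ChatMessage[];
  replying: boolean;
  sendTextMessage: (text: string) => void;
}

// Mock stand-in for the hook: appends the user message and a canned
// assistant reply to `history`, the way a chat UI would observe it.
function createMockChat(reply: (q: string) => string): MockChat {
  const chat: MockChat = {
    history: [],
    replying: false,
    sendTextMessage(text: string) {
      // the real hook would set `replying` while the model streams its answer
      chat.history.push({ role: 'user', content: text });
      chat.history.push({ role: 'assistant', content: reply(text) });
    },
  };
  return chat;
}

const chat = createMockChat((q) => `You said: ${q}`);
chat.sendTextMessage('Hello there');
console.log(chat.history.length); // 2
```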