@fadedrifleman/web-sdk v4.5.1
QuibbleAI SDK Documentation
Installation
To install the QuibbleAI SDK, run the following command:
npm i @fadedrifleman/web-sdk
Importing the SDK
Import the necessary modules from the SDK using the following syntax:
import { Quib, Transcriber, useVad } from '@fadedrifleman/web-sdk';
Configuration
Next.js
For a Next.js project, add the required plugins to your next.config.js:
/** @type {import('next').NextConfig} */
const CopyPlugin = require("copy-webpack-plugin");
const wasmPaths = [
"./node_modules/onnxruntime-web/dist/*.wasm",
"./node_modules/@ricky0123/vad-web/dist/silero_vad.onnx",
"./node_modules/@ricky0123/vad-web/dist/vad.worklet.bundle.min.js"
];
const nextConfig = {
webpack(config) {
config.module.rules.push({
test: /\.svg$/,
use: ["@svgr/webpack"],
});
config.resolve.alias = {
...config.resolve.alias,
sharp$: false,
"onnxruntime-node$": false,
};
config.plugins.push(
new CopyPlugin({
patterns: wasmPaths.map((p) => ({
from: p,
to: "static/chunks/app",
})),
})
);
// for Vercel
config.plugins.push(
new CopyPlugin({
patterns: wasmPaths.map((p) => ({
from: p,
to: "static/chunks",
})),
})
);
return config;
},
reactStrictMode: false,
async headers() {
return [
{
source: "/_next/(.*)",
headers: [
{
key: "Cross-Origin-Opener-Policy",
value: "same-origin",
},
{
key: "Cross-Origin-Embedder-Policy",
value: "require-corp",
},
],
},
];
},
};
module.exports = nextConfig;
Vite React App
For a Vite React app, add the required plugins to your vite.config.js:
import { defineConfig } from "vite";
import react from "@vitejs/plugin-react-swc";
import { viteStaticCopy } from "vite-plugin-static-copy";
export default defineConfig({
plugins: [
react(),
viteStaticCopy({
targets: [
{
src: "node_modules/@ricky0123/vad-web/dist/vad.worklet.bundle.min.js",
dest: "./",
},
{
src: "node_modules/@ricky0123/vad-web/dist/silero_vad.onnx",
dest: "./",
},
{
src: "node_modules/onnxruntime-web/dist/*.wasm",
dest: "./",
},
],
}),
],
});
Quib
The Quib class is a middleware that connects to the QuibbleAI websocket to handle various events such as user input, media output, end call, clear, and more.
Importing the Quib Class
import { Quib } from "@fadedrifleman/web-sdk";
Creating a Quib Object
To create a Quib object, provide the following parameters:
protocol: Accepted values are ws for development and wss for production.
host: The host URL provided by QuibbleAI.
uid: A unique string value provided by the user for each connection.
const quib = new Quib({
protocol: "wss",
host: process.env.HOST,
uid: "unique_connection_ID"
});
Handling Events
connected: Triggered when a connection to QuibbleAI's websocket is established.
quib.on("connected", () => { console.log("Connection to websocket established"); });
media: Provides mp3 buffer data for audio playback. This event may occur multiple times for a single audio file.
quib.on("media", (mediaPayload) => { console.log(mediaPayload?.media); });
mark: Indicates that all media events for the current interaction have been sent.
quib.on("mark", () => { console.log("Audio data for the current interaction completely received"); });
clear: Indicates that the server has detected an interruption and wants you to clear the currently playing media up to the mark event.
quib.on("clear", () => { console.log("Clear the current playing audio up to the mark event"); });
userText: Provides the user text input sent to the LLM.
quib.on("userText", (data) => { console.log(data); });
assistantText: Provides the text response from the LLM.
quib.on("assistantText", (data) => { console.log(data); });
endCall: Indicates the end of the conversation from the agent's side.
quib.on("endCall", () => { console.log("Conversation end reached from agent side"); });
close: Indicates that the websocket connection has been closed or is in the process of closing.
quib.on("close", () => { console.log("Websocket has been closed or is in closing state"); });
error: Triggered when there is an error with the websocket connection.
quib.on("error", (error) => { console.log(error); });
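The media, mark, and clear events together form a simple playback protocol. The sketch below (a hypothetical helper, not part of the SDK) shows one way to buffer media chunks until mark fires and to drop the buffer when clear signals an interruption:

```javascript
// MediaQueue is a hypothetical helper for the media/mark/clear protocol.
// "media" chunks accumulate until "mark" arrives; "clear" drops everything
// buffered so far, matching the interruption behavior described above.
class MediaQueue {
  constructor() {
    this.chunks = [];      // mp3 buffers received via "media" events
    this.complete = false; // set when "mark" arrives
  }
  push(chunk) {   // call from quib.on("media", ...)
    this.chunks.push(chunk);
  }
  mark() {        // call from quib.on("mark", ...)
    this.complete = true;
  }
  clear() {       // call from quib.on("clear", ...) on interruption
    this.chunks = [];
    this.complete = false;
  }
}
```

In a real app you would wire each method to the matching quib event handler and feed the completed buffer to your audio player.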
Methods
gpt(text): Sends text input to the LLM.
quib.gpt("Hello! How are you?");
stop(): Initiates the closing of the websocket on your end.
quib.stop();
close(): Closes the websocket connection.
quib.close();
mark(): Indicates that you have finished playing the audio data for this interaction.
quib.mark();
interruption(text): Notifies the websocket of an interruption.
quib.interruption("Pardon! I can't hear you.");
keepAlive(): Keeps the connection alive during inactivity. Recommended to be sent every 5 seconds.
quib.keepAlive();
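Combining the events and methods above, one interaction round can be sketched as follows. askAgent is a hypothetical helper (not part of the SDK); it assumes a quib object created as shown earlier, sends text with gpt(), collects media chunks until the "mark" event, then acknowledges playback with quib.mark():

```javascript
// Hypothetical helper: one request/response round over the Quib websocket.
// Assumes `quib` exposes on(), gpt(), and mark() as documented above.
function askAgent(quib, text) {
  return new Promise((resolve) => {
    const chunks = [];
    quib.on("media", (payload) => chunks.push(payload?.media));
    quib.on("mark", () => {
      quib.mark(); // tell the server we finished playing this interaction
      resolve(chunks);
    });
    quib.gpt(text); // send the user's text to the LLM
  });
}
```

Note that a production handler would also listen for "clear", "endCall", and "error" rather than only the happy path.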
Transcriber
The Transcriber class is a middleware that connects to the Deepgram websocket for audio transcription from your microphone.
Importing the Transcriber Class
import { Transcriber } from "@fadedrifleman/web-sdk";
Creating a Transcriber Object
To create a Transcriber object, provide the following parameter:
apiKey: Your Deepgram API Key (provided by QuibbleAI).
const transcriber = new Transcriber({ apiKey: process.env.DEEPGRAM_API_KEY });
Handling Events
connected: Triggered when a connection to the Deepgram websocket is established.
transcriber.on("connected", () => { console.log("Connection to Deepgram websocket established"); });
error: Triggered when there is an error with the Deepgram websocket connection.
transcriber.on("error", (error) => { console.log(error); });
close: Indicates that the websocket connection has been closed or is in the process of closing.
transcriber.on("close", () => { console.log("Websocket has been closed or is in closing state"); });
partialTranscription: Provides a partial transcription, which may not be the final transcription.
transcriber.on("partialTranscription", (text) => { console.log(text); });
transcription: Provides the final transcription.
transcriber.on("transcription", (text) => { console.log(text); });
Methods
send(): Sends audio blob data to Deepgram.
transcriber.send();
keepAlive(): Keeps the connection with the Deepgram websocket alive during inactivity. Recommended to be sent every 9 seconds.
transcriber.keepAlive();
close(): Closes the connection to Deepgram.
transcriber.close();
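Since partialTranscription results are provisional and transcription results are final, a common pattern is to keep them in separate buffers. The sketch below (a hypothetical helper, not part of the SDK) overwrites the partial text on every update and only appends finalized text:

```javascript
// Hypothetical helper for assembling Deepgram output: partial results
// replace each other, final results accumulate.
class TranscriptBuffer {
  constructor() {
    this.finalized = []; // completed utterances ("transcription" events)
    this.partial = "";   // in-flight text ("partialTranscription" events)
  }
  onPartial(text) { this.partial = text; }                 // overwritten each time
  onFinal(text) { this.finalized.push(text); this.partial = ""; }
  current() { return [...this.finalized, this.partial].join(" ").trim(); }
}
```

Wire onPartial and onFinal to the matching transcriber events to display a live transcript that never duplicates finalized text.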
useVad
The useVad hook is used for voice activity detection (VAD) with a media device object.
Importing the useVad Hook
import { useVad } from "@fadedrifleman/web-sdk";
Using the useVad Hook
First, define the mediaDevice:
const uMedia = await navigator.mediaDevices.getUserMedia({
audio: {
noiseSuppression: true,
echoCancellation: true,
},
});
Then, define the onSpeechStart and onSpeechEnd functions:
const onSpeechStartFunction = () => {
console.log("VAD detected start of speech");
};
const onSpeechEndFunction = () => {
console.log("VAD detected end of speech");
};
Now, define the useVad hook:
const { start, pause } = useVad({
userMedia: uMedia,
onSpeechStart: onSpeechStartFunction,
onSpeechEnd: onSpeechEndFunction,
});
The useVad hook returns two functions: start and pause.
start(): Starts VAD processing of the audio.
start();
pause(): Pauses VAD processing of the audio.
pause();
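A common use of start and pause is to stop listening while the agent's audio is playing, so the microphone does not pick up the agent's own voice. The sketch below is a hypothetical pattern (not part of the SDK) that assumes the start and pause functions returned by useVad as shown above:

```javascript
// Hypothetical gate around the useVad start/pause functions: pause VAD
// during agent playback, resume when playback ends.
function makeVadGate(start, pause) {
  let listening = false;
  return {
    onPlaybackStart() { // agent audio begins: stop listening
      if (listening) { pause(); listening = false; }
    },
    onPlaybackEnd() {   // agent audio ends: resume listening
      if (!listening) { start(); listening = true; }
    },
  };
}
```

You could call onPlaybackEnd when the Quib "mark" event fires and onPlaybackStart when you begin playing buffered media.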