@upstash/rag-chat-widget v0.4.3
RAG Chat Widget
A customizable React chat widget that combines Upstash Vector for similarity search, Together AI as the LLM provider, and the Vercel AI SDK for streaming responses. This ready-to-use component provides an out-of-the-box solution for adding RAG-powered chat interfaces to your Next.js application.
## Features

- ⚡ Streaming responses support
- 💻 Server actions
- 📱 Responsive design
- 🔍 Real-time context retrieval
- 💾 Persistent chat history
- 🎨 Fully customizable UI components
- 🌓 Dark/light mode support
## Installation

```shell
# Using npm
npm install @upstash/rag-chat-widget

# Using pnpm
pnpm add @upstash/rag-chat-widget

# Using yarn
yarn add @upstash/rag-chat-widget
```
## Quick Start

### 1. Environment Variables

Create an Upstash Vector database and set the environment variables below. If you don't have an account, you can start by going to the Upstash Console.

Choose an embedding model when creating an index in Upstash Vector.

```shell
UPSTASH_VECTOR_REST_URL=
UPSTASH_VECTOR_REST_TOKEN=
OPENAI_API_KEY=
TOGETHER_API_KEY=

# Optional
TOGETHER_MODEL=
```
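A misconfigured deployment is easiest to catch with a small startup check that verifies all four required variables are present. Below is a minimal sketch; the `missingEnvVars` helper is hypothetical and not part of the widget:

```typescript
// Hypothetical helper: verify the widget's required environment variables are set.
const REQUIRED_ENV_VARS = [
  "UPSTASH_VECTOR_REST_URL",
  "UPSTASH_VECTOR_REST_TOKEN",
  "OPENAI_API_KEY",
  "TOGETHER_API_KEY",
];

function missingEnvVars(env: Record<string, string | undefined>): string[] {
  // Treat both unset and empty-string values as missing.
  return REQUIRED_ENV_VARS.filter((name) => !env[name]);
}

// Example: pass process.env at startup and fail fast if anything is absent.
const missing = missingEnvVars({ OPENAI_API_KEY: "sk-example" });
// missing → the two Upstash variables plus TOGETHER_API_KEY
```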
### 2. Configure Styles

In your `tailwind.config.ts` file, add the configuration below:

```ts
import type { Config } from "tailwindcss";

export default {
  content: ["./node_modules/@upstash/rag-chat-widget/**/*.{js,mjs}"],
} satisfies Config;
```
### 3. Implementation

The RAG Chat Widget can be integrated into your application in two straightforward ways. Choose the method that best fits your project structure:
#### 1. Using a Dedicated Component File (Recommended)

Create a separate component file with the `"use client"` directive, then import and use it anywhere in your application.

```tsx
// components/widget.tsx
"use client";

import { ChatWidget } from "@upstash/rag-chat-widget";

export const Widget = () => {
  return <ChatWidget />;
};
```

```tsx
// page.tsx
import { Widget } from "./components/widget";

export default function Home() {
  return (
    <>
      <Widget />
      <p>Home</p>
    </>
  );
}
```
#### 2. Direct Integration in Client Components

Alternatively, import and use the ChatWidget directly in your client-side pages.

```tsx
// page.tsx
"use client";

import { ChatWidget } from "@upstash/rag-chat-widget";

export default function Home() {
  return (
    <>
      <ChatWidget />
      <p>Home</p>
    </>
  );
}
```
### 4. Choosing the Chat Model

You can choose one of the together.ai models for chat. The default model is `meta-llama/Meta-Llama-3.1-8B-Instruct-Turbo`. You can configure it via the environment variables:

```shell
TOGETHER_MODEL="deepseek-ai/DeepSeek-V3"
```
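The model selection above is an environment override with a fallback to the documented default. A sketch of that resolution logic; the `resolveChatModel` helper is illustrative, not the widget's actual code:

```typescript
// Default used when TOGETHER_MODEL is unset (per the docs above).
const DEFAULT_MODEL = "meta-llama/Meta-Llama-3.1-8B-Instruct-Turbo";

// Hypothetical helper showing how the override resolves.
function resolveChatModel(env: Record<string, string | undefined>): string {
  return env.TOGETHER_MODEL ?? DEFAULT_MODEL;
}

resolveChatModel({}); // → the default model
resolveChatModel({ TOGETHER_MODEL: "deepseek-ai/DeepSeek-V3" }); // → "deepseek-ai/DeepSeek-V3"
```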
## Adding Content

You can add content to your RAG Chat widget in several ways. The SDK provides methods to add various types of content programmatically:
```ts
import { RAGChat, openai } from "@upstash/rag-chat";

export const ragChat = new RAGChat({
  model: openai("gpt-4-turbo"),
});

// Add text content
await ragChat.context.add("Your text content here");

// Add PDF documents
await ragChat.context.add({
  type: "pdf",
  fileSource: "./path/to/document.pdf",
});

// Add web content
await ragChat.context.add({
  type: "html",
  source: "https://your-website.com",
});
```
For more detailed examples and options, check out the RAG Chat documentation.
You can also manage your content directly through the Upstash Vector Console:

- Navigate to the Upstash Console.
- Go to the details page of your Vector database.
- Navigate to the Data Browser tab.
- Here, you can either upload a PDF or use one of our sample datasets.
## Contributing

We welcome contributions! Please see our contributing guidelines for more details.

## License

MIT License - see the LICENSE file for details.