# React Easy Chatbot
A customizable React chatbot component that can be easily integrated into any React application.
## Features
- 🚀 Simple to integrate with any React app
- ⚙️ Highly customizable appearance
- 🌙 Light and dark themes
- 📱 Mobile-responsive design
- 🔄 Easy API integration
- 🔒 Token-based authentication
## Installation

```bash
npm install react-llm-chatbot
# or
yarn add react-llm-chatbot
```
## Quick Start

```jsx
import React from 'react';
import Chatbot from 'react-llm-chatbot';

const App = () => {
  // You need to obtain this token from your backend service
  const token = 'your-api-access-token';

  return (
    <div className="App">
      <Chatbot
        apiBaseURL="https://your-chatbot-api.com/api"
        token={token}
        theme="light"
        onError={(error) => console.error('Chatbot error:', error)}
      />
    </div>
  );
};

export default App;
```
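The component does not issue tokens itself; the token comes from your own backend service. As a rough sketch, you might fetch it once on mount before rendering the chatbot. The `/auth/chatbot-token` endpoint and its `{ "token": "..." }` response below are hypothetical placeholders for whatever your backend actually exposes:

```jsx
import React, { useEffect, useState } from 'react';
import Chatbot from 'react-llm-chatbot';

const App = () => {
  const [token, setToken] = useState(null);

  useEffect(() => {
    // Hypothetical endpoint on your own backend that issues a chatbot token.
    fetch('/auth/chatbot-token')
      .then((res) => res.json())
      .then((data) => setToken(data.token))
      .catch((error) => console.error('Failed to fetch chatbot token:', error));
  }, []);

  // Render the chatbot only once a token is available.
  if (!token) return null;

  return (
    <Chatbot
      apiBaseURL="https://your-chatbot-api.com/api"
      token={token}
      onError={(error) => console.error('Chatbot error:', error)}
    />
  );
};

export default App;
```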
## Configuration Options

| Prop | Type | Default | Description |
| --- | --- | --- | --- |
| apiBaseURL | string | - | Base URL for the chatbot API |
| token | string | - | Authentication token for API access (required) |
| theme | string | 'light' | Theme ('light' or 'dark') |
| position | string | 'bottom-right' | Position of the chatbot ('bottom-right', 'bottom-left', 'top-right', 'top-left') |
| width | number/string | 350 | Width of the chat window |
| height | number/string | 500 | Height of the chat window |
| title | string | 'Chat Assistant' | Title shown in the chat header |
| placeholder | string | 'Type a message...' | Placeholder for the input field |
| showHeader | boolean | true | Whether to show the chat header |
| showFooter | boolean | true | Whether to show the chat footer |
| customStyles | object | {} | Custom CSS variables to override default styles |
| onError | function | () => {} | Callback when an error occurs |
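Most of these props can be combined directly on the component. The snippet below is an illustrative sketch; in particular, the CSS variable name passed to `customStyles` is a placeholder and should be replaced with whatever variables the component's stylesheet actually defines:

```jsx
import React from 'react';
import Chatbot from 'react-llm-chatbot';

// Illustrative only: the CSS variable name here is a placeholder;
// use the variables the component's stylesheet actually defines.
const customStyles = { '--chatbot-primary-color': '#4f46e5' };

const App = ({ token }) => (
  <Chatbot
    apiBaseURL="https://your-chatbot-api.com/api"
    token={token}
    theme="dark"
    position="bottom-left"
    width={400}
    height="60vh"
    title="Support Bot"
    placeholder="Ask me anything..."
    showFooter={false}
    customStyles={customStyles}
    onError={(error) => console.error('Chatbot error:', error)}
  />
);

export default App;
```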
## API Integration

This chatbot component is designed to work with a backend API that provides the following endpoints. A hedged example of calling them directly is shown after the list.
1. `POST /chat`
   - Description: Sends a message to the chatbot
   - Headers: `Authorization: Bearer <token>`
   - Request body: `{ "message": "user_message" }`
   - Response: `{ "status": "success", "response": "chatbot_response" }`

2. `GET /test`
   - Description: Tests the API connection
   - Response: `{ "status": "ok", "message": "API is working!", "active_tokens": [number_of_active_tokens] }`
## Advanced Usage

### Using the ChatbotProvider

For more control, you can use the `ChatbotProvider` to manage the chatbot state in your application:
```jsx
import React from 'react';
import { ChatbotProvider } from 'react-llm-chatbot';

const App = () => {
  return (
    <ChatbotProvider
      apiBaseURL="https://your-chatbot-api.com/api"
      token="your-api-token"
    >
      {/* Your components can now use the useChatbot hook */}
      <YourCustomChat />
    </ChatbotProvider>
  );
};

export default App;
```
### Using the useChatbot Hook

```jsx
import React from 'react';
import { useChatbot } from 'react-llm-chatbot';

const YourCustomChat = () => {
  const {
    messages,
    sendMessage,
    isLoading,
    clearMessages
  } = useChatbot();

  // Custom chat UI implementation
  return (
    <div>
      {/* Your custom UI */}
    </div>
  );
};
```
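Building on that skeleton, a fuller custom UI might look like the sketch below. The shape of each entry in `messages` (assumed here to be `{ role, content }`) is not documented in this README, so adjust the rendering to the actual message objects in your app:

```jsx
import React, { useState } from 'react';
import { useChatbot } from 'react-llm-chatbot';

const YourCustomChat = () => {
  const { messages, sendMessage, isLoading, clearMessages } = useChatbot();
  const [draft, setDraft] = useState('');

  const handleSend = () => {
    if (!draft.trim()) return;
    sendMessage(draft);
    setDraft('');
  };

  return (
    <div>
      {/* Assumed message shape: { role, content }; adjust to the actual objects in `messages`. */}
      <ul>
        {messages.map((msg, index) => (
          <li key={index}>
            <strong>{msg.role}:</strong> {msg.content}
          </li>
        ))}
      </ul>

      {isLoading && <p>Assistant is typing...</p>}

      <input
        value={draft}
        onChange={(event) => setDraft(event.target.value)}
        onKeyDown={(event) => event.key === 'Enter' && handleSend()}
        placeholder="Type a message..."
      />
      <button onClick={handleSend} disabled={isLoading}>Send</button>
      <button onClick={clearMessages}>Clear</button>
    </div>
  );
};

export default YourCustomChat;
```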
## License

MIT