# LLM Studio Chat Web SDK

`@skit-ai/llm-studio-chat-websdk` v1.1.3
A lightweight, customizable chat widget for LLM Studio. This SDK allows you to easily integrate a chat interface with LLM Studio's text API into your website or application.
## 🚨 Browser Usage (Important)
If you're using this directly in a browser via CDN or a script tag, you MUST include the polyfill first:
```html
<!-- 1. First include the polyfill -->
<script src="https://cdn.jsdelivr.net/npm/@skit-ai/llm-studio-chat-websdk/dist/polyfill.js"></script>

<!-- 2. Then include the SDK -->
<script src="https://cdn.jsdelivr.net/npm/@skit-ai/llm-studio-chat-websdk/dist/llm-studio-chat-websdk.umd.js"></script>
```
For detailed instructions on browser usage, see `BROWSER_USAGE.md`.
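If the SDK script is loaded with `defer` or injected dynamically, the `LLMStudioChatWebSDK` global may not exist yet when your initialization code runs. The helper below is an illustrative sketch, not part of the SDK; only the global name and the `init` call match the examples later in this README:

```javascript
// Illustrative guard (not part of the SDK): poll briefly for the UMD
// global before calling init, in case the script tag loads late or
// is injected dynamically. Config values are placeholders.
function whenSdkReady(callback, retries = 50) {
  if (typeof window !== 'undefined' && window.LLMStudioChatWebSDK) {
    callback(window.LLMStudioChatWebSDK);
  } else if (retries > 0) {
    // Check again in 100 ms, up to ~5 seconds in total.
    setTimeout(() => whenSdkReady(callback, retries - 1), 100);
  } else {
    console.error('LLM Studio Chat Web SDK did not load');
  }
}

if (typeof window !== 'undefined') {
  whenSdkReady((sdk) => {
    sdk.init({
      clientId: 'YOUR_CLIENT_ID',
      assistantId: 'YOUR_ASSISTANT_ID',
      region: 'us',
    });
  });
}
```

If the polyfill and SDK scripts are included synchronously in order, as shown above, this guard is unnecessary.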
## Features
- 💬 Easy-to-integrate chat widget
- 🌐 Works with LLM Studio's text API
- 🧩 Multiple integration options (HTML, React)
- 🎨 Fully customizable appearance
- 📱 Responsive design
- 🚀 Small bundle size
- 🛡️ CSS isolation to prevent styling conflicts
## Installation

### Option 1: Via CDN (for plain HTML websites)

```html
<script src="https://unpkg.com/@skit-ai/llm-studio-chat-websdk@latest/dist/llm-studio-chat-websdk.umd.js"></script>
```

### Option 2: Via npm (for React/JS applications)

```bash
npm install @skit-ai/llm-studio-chat-websdk
```
## Usage

### Basic HTML Integration
Simply add the script to your HTML page and initialize the chat widget:
```html
<script src="https://unpkg.com/@skit-ai/llm-studio-chat-websdk@latest/dist/llm-studio-chat-websdk.umd.js"></script>
<script>
  document.addEventListener('DOMContentLoaded', function () {
    LLMStudioChatWebSDK.init({
      clientId: "YOUR_CLIENT_ID", // Required
      assistantId: "YOUR_ASSISTANT_ID", // Required
      region: "us", // Required - your region ('us' or 'in')

      // Optional customization
      position: "bottom-right",
      assistantName: "Support Bot",
      welcomeMessage: "Hello! How can I help you today?",
      suggestedPrompts: [
        "What services do you offer?",
        "How do I reset my password?",
        "I need technical support"
      ],
      primaryColor: "#4f46e5"
    });
  });
</script>
```
### React Integration
Import and use the ChatWidget component:
```jsx
import { ChatWidget } from '@skit-ai/llm-studio-chat-websdk';

function App() {
  return (
    <div>
      <h1>My Website</h1>
      <ChatWidget
        clientId="YOUR_CLIENT_ID"
        assistantId="YOUR_ASSISTANT_ID"
        region="us" // 'us' or 'in'
        position="bottom-right"
        primaryColor="#4f46e5"
      />
    </div>
  );
}
```
## Configuration Options
The chat widget accepts the following configuration options:
### Required Parameters

| Parameter | Type | Description |
| --- | --- | --- |
| `clientId` | string | Your LLM Studio client ID |
| `assistantId` | string | Your LLM Studio assistant ID |
| `region` | string | Region for API requests (supported values: `'us'`, `'in'`) |
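A quick client-side check of the required parameters before calling `init` can surface configuration mistakes early. The following is an illustrative sketch; `validateConfig` is a hypothetical helper, not part of the SDK:

```javascript
// Illustrative helper (not part of the SDK): verify the three
// required parameters before passing the config to init().
const SUPPORTED_REGIONS = ['us', 'in'];

function validateConfig(config) {
  const errors = [];
  if (!config.clientId) errors.push('clientId is required');
  if (!config.assistantId) errors.push('assistantId is required');
  if (!SUPPORTED_REGIONS.includes(config.region)) {
    errors.push("region must be 'us' or 'in'");
  }
  return errors;
}
```

For example, `validateConfig({ clientId: 'c', assistantId: 'a', region: 'eu' })` returns a single error complaining about the unsupported region.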
### Appearance Customization

| Parameter | Type | Default | Description |
| --- | --- | --- | --- |
| `position` | string | `'bottom-right'` | Widget position (`'bottom-right'`, `'bottom-left'`, `'top-right'`, `'top-left'`) |
| `assistantName` | string | `'Assistant'` | Name of the chat assistant |
| `assistantAvatar` | string | `undefined` | URL of an avatar image for the assistant |
| `welcomeMessage` | string | `'Hello! How can I help you today?'` | Initial message shown in the chat |
| `suggestedPrompts` | string[] | `[]` | Suggested message buttons to show |
| `primaryColor` | string | `'#3b82f6'` | Primary color for UI elements |
| `secondaryColor` | string | `'#6c757d'` | Secondary color for UI elements |
| `height` | string | `'600px'` | Height of the chat widget |
| `width` | string | `'380px'` | Width of the chat widget |
### Behavior Customization

| Parameter | Type | Default | Description |
| --- | --- | --- | --- |
| `defaultOpen` | boolean | `false` | Whether the chat is open by default |
| `embedded` | boolean | `false` | Whether to render inline in the page instead of as a popup |
| `autoSendFirstMessage` | boolean | `true` | Whether to automatically send an initial message |
| `userId` | string | `'anonymous'` | User ID for tracking conversations |
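Taken together, the optional parameters behave like a plain options object merged over documented defaults. The sketch below is illustrative: the `DEFAULTS` values come from the tables above, but `mergeOptions` itself is not part of the SDK's public API.

```javascript
// Documented defaults from the tables above; mergeOptions is an
// illustrative helper, not part of the SDK's public API.
const DEFAULTS = {
  position: 'bottom-right',
  assistantName: 'Assistant',
  welcomeMessage: 'Hello! How can I help you today?',
  suggestedPrompts: [],
  primaryColor: '#3b82f6',
  secondaryColor: '#6c757d',
  height: '600px',
  width: '380px',
  defaultOpen: false,
  embedded: false,
  autoSendFirstMessage: true,
  userId: 'anonymous',
};

function mergeOptions(userOptions) {
  // User-supplied keys win; everything else falls back to defaults.
  return { ...DEFAULTS, ...userOptions };
}
```

So passing only `primaryColor: '#4f46e5'` changes the accent color while every other option keeps its default.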
## Advanced Usage

### CSS Prefixing to Prevent Style Conflicts

To prevent style conflicts with your website's CSS, all Tailwind CSS classes in the widget are prefixed with `skit-ai-llm-`. This ensures that the widget's styling won't be overridden by your custom CSS, and vice versa.

For example, instead of `py-5`, the widget uses `skit-ai-llm-py-5` internally. The prefix is applied automatically; you don't need to use it when customizing the widget.
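The prefixing can be pictured as a plain transformation over Tailwind class names. This is illustrative only; the real prefix is applied at build time inside the widget, not at runtime:

```javascript
// Illustrative only: the widget's Tailwind build applies this
// prefix, so widget classes never collide with your site's CSS.
const PREFIX = 'skit-ai-llm-';

function prefixClasses(classNames) {
  return classNames.map((name) => PREFIX + name);
}
```

For instance, `prefixClasses(['py-5', 'text-sm'])` yields `['skit-ai-llm-py-5', 'skit-ai-llm-text-sm']`, so a `py-5` rule in your stylesheet and the widget's `skit-ai-llm-py-5` can never target the same elements.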
### Embedded Chat
You can embed the chat directly into your page:
```jsx
<div style={{ height: '500px', width: '100%', maxWidth: '400px' }}>
  <ChatWidget
    clientId="YOUR_CLIENT_ID"
    assistantId="YOUR_ASSISTANT_ID"
    region="us" // 'us' or 'in'
    embedded={true}
  />
</div>
```
### Controlled Mode
You can control the open/close state of the chat:
```jsx
import { useState } from 'react';
import { ChatWidget } from '@skit-ai/llm-studio-chat-websdk';

function App() {
  const [isChatOpen, setIsChatOpen] = useState(false);

  return (
    <div>
      <h1>My Website</h1>
      <button onClick={() => setIsChatOpen(true)}>Open Chat</button>
      <ChatWidget
        clientId="YOUR_CLIENT_ID"
        assistantId="YOUR_ASSISTANT_ID"
        region="us" // 'us' or 'in'
        isOpen={isChatOpen}
        onClose={() => setIsChatOpen(false)}
      />
    </div>
  );
}
```
## Browser Support
The SDK supports all modern browsers:
- Chrome (latest)
- Firefox (latest)
- Safari (latest)
- Edge (latest)
## License
MIT