# Lore Headless

`@mencraft/lore-headless` v0.1.22
A headless chat-streaming library that works with React and other frontend frameworks.
## Installation

```bash
npm install @mencraft/lore-headless
```

## Configuration
When using this library, you'll need to provide the following authentication values:
```ts
// Authentication credentials
const auth = {
  clientId: 'your-client-id',      // Your client ID
  token: 'your-firebase-id-token'  // Your Firebase ID token
};

// Graph ID for chat operations
const graphId = 'your-graph-id';   // Your graph ID
```

### API URLs
By default, the API client is configured for production URLs. You can specify different modes or use custom URLs:
```ts
// Default usage (production mode)
const apiClient = new LoreClient();

// Specify a different mode
const stagingClient = new LoreClient('staging');
const devClient = new LoreClient('development');

// Use custom URLs (mode will be ignored when custom URLs are provided)
const customClient = new LoreClient(
  'production', // This will be ignored when custom URLs are provided
  'https://your-custom-chat-url.com',
  'https://your-custom-validation-url.com'
);
```

Available modes:

- `'production'` (default): Uses production API URLs
- `'staging'`: Uses staging API URLs
- `'development'`: Uses local development URLs (localhost)
Custom URLs:

When providing custom URLs, both `customChatUrl` and `customValidationUrl` must be provided together. If custom URLs are specified, the `mode` parameter is ignored.
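If the custom URLs come from configuration, a small guard can enforce this both-or-neither rule before the client is constructed. The sketch below is an illustration only; `createClient` is not part of the library:

```ts
import { LoreClient } from '@mencraft/lore-headless';

// Hypothetical helper: builds a client only when the custom URLs are consistent
function createClient(customChatUrl?: string, customValidationUrl?: string): LoreClient {
  const hasChat = customChatUrl !== undefined;
  const hasValidation = customValidationUrl !== undefined;
  if (hasChat !== hasValidation) {
    throw new Error('customChatUrl and customValidationUrl must be provided together');
  }
  return hasChat
    ? new LoreClient('production', customChatUrl, customValidationUrl) // mode is ignored here
    : new LoreClient(); // falls back to the default production URLs
}
```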
In a real application, you would typically set the authentication values using environment variables:
```ts
// Authentication credentials
const auth = {
  clientId: process.env.CLIENT_ID,
  token: process.env.FIREBASE_TOKEN
};

// Graph ID for chat operations
const graphId = process.env.GRAPH_ID;
```
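A quick startup check can catch missing values early. This is a sketch using the variable names from the snippet above:

```ts
// Fail fast if any of the expected environment variables is missing (sketch)
for (const name of ['CLIENT_ID', 'FIREBASE_TOKEN', 'GRAPH_ID']) {
  if (!process.env[name]) {
    throw new Error(`Missing required environment variable: ${name}`);
  }
}
```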
## Usage

### Core API (Framework Agnostic)
```ts
import { LoreClient, ChatStreamManager, ChatRequest, ChatStreamCallbacks } from '@mencraft/lore-headless';

// Create an API client
// Default usage (production mode)
const apiClient = new LoreClient();

// Or specify a different mode
// const apiClient = new LoreClient('staging');
// const apiClient = new LoreClient('development');

// Or use custom URLs
// const apiClient = new LoreClient(
//   'production', // mode will be ignored
//   'https://your-custom-chat-url.com',
//   'https://your-custom-validation-url.com'
// );

// Set up authentication
const auth = {
  clientId: 'your-client-id',
  token: 'your-firebase-id-token'
};

// Define callbacks
const callbacks: ChatStreamCallbacks = {
  onTextStream: (text) => {
    console.log('Received text:', text);
  },
  onBatchData: (data) => {
    console.log('Received batch data:', data);
  },
  onError: (error) => {
    console.error('Error:', error);
  },
  onEnd: () => {
    console.log('Stream ended');
  }
};

// Create a chat stream manager
const chatManager = new ChatStreamManager(apiClient, auth, callbacks);

// Start a chat stream
const request: ChatRequest = {
  user: 'user-id',
  session: 'session-id',
  query: 'Hello, how are you?'
};
chatManager.startChatStream(request, 'graph-id');

// Cancel the stream if needed
chatManager.cancelStream();

// Send feedback for a specific assistant message
// This should be called for each assistant message that receives feedback
chatManager.sendFeedback({
  user: 'user-id',
  session: 'session-id',
  query: 'Hello, how are you?',            // The user message that prompted the assistant's response
  response: 'I am doing well, thank you!', // The assistant's response being rated
  feedback: 'Great response!',             // Optional feedback text
  feedback_value: 1                        // Positive (1) or negative (-1) rating
}, 'graph-id');
```

### React Hooks
To use the React hooks, you need to import them from the React entry point:
#### Callback-based Approach (Original)
```tsx
import React, { useState } from 'react';
import { LoreClient, ChatRequest } from '@mencraft/lore-headless';
import { useChatStream, useFeedback } from '@mencraft/lore-headless/react';

// Message interface with unique ID
interface Message {
  id: string;
  text: string;
  sender: 'user' | 'assistant' | 'system' | 'error';
  timestamp: Date;
}

// Feedback modal interface
interface FeedbackModalState {
  isOpen: boolean;
  messageId: string;
  assistantMessage: string;
  userMessage: string;
}

const ChatComponent = () => {
  // Messages state
  const [messages, setMessages] = useState<Message[]>([]);

  // Feedback modal state
  const [feedbackModal, setFeedbackModal] = useState<FeedbackModalState>({
    isOpen: false,
    messageId: '',
    assistantMessage: '',
    userMessage: ''
  });

  // Feedback form state
  const [feedbackText, setFeedbackText] = useState('');
  const [feedbackRating, setFeedbackRating] = useState<number | null>(null);

  // Create an API client
  // Default usage (production mode)
  const apiClient = new LoreClient();

  // Or specify a different mode
  // const apiClient = new LoreClient('staging');
  // const apiClient = new LoreClient('development');

  // Or use custom URLs
  // const apiClient = new LoreClient(
  //   'production', // mode will be ignored
  //   'https://your-custom-chat-url.com',
  //   'https://your-custom-validation-url.com'
  // );

  // Set up authentication
  const auth = {
    clientId: 'your-client-id',
    token: 'your-firebase-id-token'
  };

  // Use the chat stream hook
  const { startChatStream, cancelStream } = useChatStream(apiClient, auth, {
    onTextStream: (text) => {
      setMessages((prev) => {
        const lastMessage = prev[prev.length - 1];
        if (lastMessage?.sender === 'assistant') {
          return [
            ...prev.slice(0, -1),
            { ...lastMessage, text: lastMessage.text + text }
          ];
        }
        return [
          ...prev,
          {
            id: Date.now().toString(),
            text,
            sender: 'assistant',
            timestamp: new Date()
          }
        ];
      });
    },
    onError: (error) => {
      console.error('Stream error:', error);
    },
    onEnd: () => {
      console.log('Stream ended');
    }
  });

  // Use the feedback hook
  const { sendFeedback, loading } = useFeedback(apiClient, auth, 'graph-id');

  // Handle sending a message
  const handleSendMessage = (message: string) => {
    // Add user message to the list
    setMessages((prev) => [
      ...prev,
      {
        id: Date.now().toString(),
        text: message,
        sender: 'user',
        timestamp: new Date()
      }
    ]);

    // Start the chat stream
    const request: ChatRequest = {
      user: 'user-id',
      session: 'session-id',
      query: message
    };
    startChatStream(request, 'graph-id');
  };

  // Open feedback modal for a specific message
  const openFeedbackModal = (messageId: string) => {
    const assistantMessage = messages.find(m => m.id === messageId && m.sender === 'assistant');
    if (!assistantMessage) return;

    // Find the user message that prompted this assistant response
    // This is a simplified example - in a real app, you'd track which user message led to which assistant response
    const userMessageIndex = messages.findIndex(m => m.id === messageId) - 1;
    if (userMessageIndex < 0) return;
    const userMessage = messages[userMessageIndex];
    if (userMessage.sender !== 'user') return;

    setFeedbackModal({
      isOpen: true,
      messageId,
      assistantMessage: assistantMessage.text,
      userMessage: userMessage.text
    });

    // Reset form state
    setFeedbackText('');
    setFeedbackRating(null);
  };

  // Close feedback modal
  const closeFeedbackModal = () => {
    setFeedbackModal(prev => ({ ...prev, isOpen: false }));
  };

  // Submit feedback
  const submitFeedback = () => {
    if (feedbackRating === null) {
      alert('Please select a rating before submitting.');
      return;
    }

    sendFeedback({
      user: 'user-id',
      session: 'session-id',
      query: feedbackModal.userMessage,
      response: feedbackModal.assistantMessage,
      feedback: feedbackText || (feedbackRating > 0 ? 'Helpful response' : 'Not helpful response'),
      feedback_value: feedbackRating
    });

    closeFeedbackModal();
  };

  return (
    <div className="chat-container">
      {/* Messages list */}
      <div className="messages">
        {messages.map((message) => (
          <div key={message.id} className={`message ${message.sender}`}>
            <div className="message-content">{message.text}</div>
            <div className="message-timestamp">
              {message.timestamp.toLocaleTimeString()}
            </div>
            {/* Add feedback button to assistant messages */}
            {message.sender === 'assistant' && (
              <button
                className="feedback-button"
                onClick={() => openFeedbackModal(message.id)}
              >
                Give Feedback
              </button>
            )}
          </div>
        ))}
      </div>

      {/* Message input form */}
      <form
        className="input-form"
        onSubmit={(e) => {
          e.preventDefault();
          const input = e.currentTarget.elements.namedItem('message') as HTMLInputElement;
          if (input && input.value.trim()) {
            handleSendMessage(input.value.trim());
            input.value = '';
          }
        }}
      >
        <input
          type="text"
          name="message"
          placeholder="Type your message..."
          disabled={loading}
        />
        <button type="submit" disabled={loading}>Send</button>
      </form>

      {/* Feedback modal */}
      {feedbackModal.isOpen && (
        <div className="modal-overlay">
          <div className="modal-content">
            <div className="modal-header">
              <h3>Provide Feedback</h3>
              <button className="close-button" onClick={closeFeedbackModal}>×</button>
            </div>
            <div className="modal-body">
              <div className="message-context">
                <div className="context-label">Assistant's response:</div>
                <div className="context-text">{feedbackModal.assistantMessage}</div>
              </div>
              <textarea
                className="feedback-textarea"
                placeholder="Enter your feedback here..."
                value={feedbackText}
                onChange={(e) => setFeedbackText(e.target.value)}
              />
              <div className="rating-container">
                <div className="rating-label">How would you rate this response?</div>
                <button
                  className={`rating-button positive ${feedbackRating === 1 ? 'selected' : ''}`}
                  onClick={() => setFeedbackRating(1)}
                >
                  👍 Helpful
                </button>
                <button
                  className={`rating-button negative ${feedbackRating === -1 ? 'selected' : ''}`}
                  onClick={() => setFeedbackRating(-1)}
                >
                  👎 Not Helpful
                </button>
              </div>
              <button
                className="submit-button"
                onClick={submitFeedback}
                disabled={loading}
              >
                Submit Feedback
              </button>
            </div>
          </div>
        </div>
      )}
    </div>
  );
};
```

## Running the Examples
### Vanilla JavaScript Example
To run the vanilla JavaScript example:
1. Build the package:

   ```bash
   npm run build
   ```

2. Start the example server:

   ```bash
   node examples/vanilla/server.js
   ```

3. Open your browser and navigate to the URL shown in the console. The server will automatically find an available port, starting from 3000.
Note: The vanilla example uses ES modules, which require a web server to work properly due to CORS restrictions when loading modules from the file system.
### React Example
The React example is provided as a reference implementation. To use it in a real React application:
1. Install the package:

   ```bash
   npm install path/to/lore-headless
   ```

2. Import the components and hooks as shown in the example.
## Package Structure
The package is structured to be truly headless, with React-specific functionality separated into a dedicated entry point:
- Main Entry Point: Core functionality that works with any framework
- React Entry Point: React-specific hooks that depend on React
This structure allows you to use the package with or without React, depending on your needs.
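For reference, dual entry points like this are usually declared through an `exports` map in `package.json`. The snippet below is a sketch of that common pattern, not this package's actual manifest; the file paths are assumptions:

```json
{
  "name": "@mencraft/lore-headless",
  "exports": {
    ".": "./dist/index.js",
    "./react": "./dist/react/index.js"
  }
}
```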
## Development
To develop this package:
1. Install dependencies:

   ```bash
   npm install
   ```

2. Build the package:

   ```bash
   npm run build
   ```

3. Run tests:

   ```bash
   npm test
   ```
## API Reference

### Core

#### LoreClient
A client for making API requests with support for different environments.
```ts
// Default usage (production mode)
const apiClient = new LoreClient();

// Specify a different mode
const stagingClient = new LoreClient('staging');
const devClient = new LoreClient('development');

// Use custom URLs (mode will be ignored)
const customClient = new LoreClient(
  'production', // This parameter will be ignored
  'https://your-custom-chat-url.com',
  'https://your-custom-validation-url.com'
);
```

Constructor Parameters:

- `mode` (string, optional): Environment mode. Options: `'production'` (default), `'staging'`, `'development'`
- `customChatUrl` (string, optional): Custom chat URL. Must be provided together with `customValidationUrl`
- `customValidationUrl` (string, optional): Custom validation URL. Must be provided together with `customChatUrl`
Environment Modes:

- Production (`'production'`): Uses production API endpoints
- Staging (`'staging'`): Uses staging API endpoints for testing
- Development (`'development'`): Uses local development endpoints (localhost)
Custom URLs:

When custom URLs are provided, both `customChatUrl` and `customValidationUrl` must be specified together; the `mode` parameter is ignored in this case.
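Taken together, the examples above imply a constructor shape roughly like the following. This is a sketch for orientation only, not the package's published type declarations:

```ts
type LoreMode = 'production' | 'staging' | 'development';

// Constructor signature implied by the usage examples (assumption)
declare class LoreClient {
  constructor(
    mode?: LoreMode,
    customChatUrl?: string,
    customValidationUrl?: string
  );
}
```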
#### ChatStreamManager
A manager for handling chat streams.
```ts
const chatManager = new ChatStreamManager(apiClient, auth, callbacks);
```

### React Hooks
#### Message-based Approach (Enhanced)
The library now provides an enhanced version of the `useChatStream` hook that manages messages internally and returns them as part of the hook's return value. This simplifies message management and offers a more streamlined API.
```tsx
import React, { useState } from 'react';
import { LoreClient, LLMRole } from '@mencraft/lore-headless';
import { useChatStream, useFeedback, ChatMessage } from '@mencraft/lore-headless/react';

const ChatComponent = () => {
  const [inputText, setInputText] = useState('');

  // API client and authentication (see the Configuration section above)
  const apiClient = new LoreClient();
  const auth = {
    clientId: 'your-client-id',
    token: 'your-firebase-id-token'
  };

  // Use the enhanced useChatStream hook that returns messages
  const {
    messages,        // Array of chat messages with different roles (user, assistant, system, error)
    isStreaming,     // Boolean indicating if a message is currently streaming
    startChatStream, // Function to start a new chat stream
    cancelStream,    // Function to cancel the current stream
    clearMessages    // Function to clear all messages
  } = useChatStream(
    apiClient,
    auth,
    {
      // You can still use the callbacks if needed
      onSystemMessage: (text) => {
        console.log('System message:', text);
      },
    }
  );

  const handleSendMessage = async () => {
    if (!inputText.trim() || isStreaming) return;
    setInputText('');

    try {
      await startChatStream(
        {
          user: 'user-id',
          session: 'session-id',
          query: inputText,
        },
        'graph-id'
      );
    } catch (error) {
      console.error('Failed to start chat stream:', error);
    }
  };

  return (
    <div className="chat-container">
      {/* Messages list */}
      <div className="messages">
        {messages.map((message) => (
          <div
            key={message.id}
            className={`message ${message.role}`}
          >
            <div className="message-header">
              {message.role.charAt(0).toUpperCase() + message.role.slice(1)}
              {message.isStreaming && <span className="streaming-indicator" />}
            </div>
            <div className="message-content">{message.content}</div>
            <div className="message-timestamp">
              {message.timestamp.toLocaleTimeString()}
            </div>
          </div>
        ))}
      </div>

      {/* Message input form */}
      <div className="input-container">
        <input
          type="text"
          value={inputText}
          onChange={(e) => setInputText(e.target.value)}
          placeholder="Type your message..."
          disabled={isStreaming}
        />
        <button
          onClick={handleSendMessage}
          disabled={isStreaming || !inputText.trim()}
        >
          Send
        </button>
        {isStreaming && (
          <button onClick={cancelStream}>
            Cancel
          </button>
        )}
        <button onClick={clearMessages}>
          Clear Chat
        </button>
      </div>
    </div>
  );
};
```

The `ChatMessage` interface provides a structured way to represent different types of messages:
```ts
interface ChatMessage {
  id: string;               // Unique identifier for the message
  role: LLMRole | 'error';  // Role of the message sender (user, assistant, system, or error)
  content: string;          // Content of the message
  timestamp: Date;          // When the message was created
  isStreaming?: boolean;    // Whether the message is currently being streamed
}
```

Benefits of the message-based approach:
- Automatic message management (no need to manually track messages)
- Built-in support for streaming indicators
- Consistent message structure with timestamps and IDs
- Support for different message types (user, assistant, system, error)
- Simplified API with fewer lines of code
#### useChatStream
A hook for using the chat stream in React.
```ts
import { useChatStream } from '@mencraft/lore-headless/react';

// Callback-based approach (original)
const { startChatStream, cancelStream } = useChatStream(apiClient, auth, callbacks);

// Message-based approach (enhanced)
const { messages, isStreaming, startChatStream, cancelStream, clearMessages } = useChatStream(apiClient, auth, callbacks);
```

#### useFeedback
A hook for sending feedback in React.
```ts
import { useFeedback } from '@mencraft/lore-headless/react';

const { sendFeedback, loading, error, data } = useFeedback(apiClient, auth, graphId);
```
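A minimal usage sketch: the payload shape mirrors the `sendFeedback` call in the callback-based example above, and `handleThumbsUp` is a hypothetical handler. The shapes of `error` and `data` are not documented here, so they are only logged:

```ts
const { sendFeedback, loading, error, data } = useFeedback(apiClient, auth, 'graph-id');

// Hypothetical click handler for a "thumbs up" button
const handleThumbsUp = () => {
  if (loading) return; // avoid duplicate submissions while a request is in flight
  sendFeedback({
    user: 'user-id',
    session: 'session-id',
    query: 'Hello, how are you?',            // the user message being rated against
    response: 'I am doing well, thank you!', // the assistant response being rated
    feedback: 'Helpful response',            // optional free-text feedback
    feedback_value: 1                        // 1 = positive, -1 = negative
  });
};

if (error) console.error('Feedback request failed:', error);
if (data) console.log('Feedback response:', data);
```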
## License

MIT