# GSB AI Assistant

`@gsb-core/ai-assistant` v0.1.8
The GSB AI Assistant package provides integrations with various AI services and models through the GSB Core platform. It includes components for chat-based interactions using different LLM providers.
## Features
- AI Chat Service for communicating with LLM providers
- Support for multiple LLM providers (OpenAI, Azure OpenAI, Anthropic, HuggingFace)
- Template-based prompt processing
- Chat history management
- Entity interfaces for typed data
## Installation

The package is designed to work within the GSB workspace, using the workspace's main `node_modules` directory:

```bash
# From the GSB workspace root:
cd packages/gsb-ai-assistant
npm run setup
```

## Usage

### AI Chat Service

The primary component is the `GsbAiChatService`, which provides methods for interacting with AI chat models:
```typescript
import { GsbAiChatService } from '@gsb-core/ai-assistant';

// Get the singleton instance
const aiChatService = GsbAiChatService.getInstance();

// Set the runtime service (required for running serverless functions)
aiChatService.setRuntime(runtimeService);

// Start a new chat
const result = await aiChatService.chat(
  "What's the status of our project?",
  "llm-config-id", // ID of a saved LlmConfiguration
  undefined,       // New chat
  { entity: projectEntity }
);

// Continue the conversation
const followUp = await aiChatService.chat(
  "What should be our next steps?",
  "llm-config-id",
  result.chat.id, // Use the existing chat ID
  { entity: projectEntity }
);
```

### Chat History Management
The service includes methods for retrieving chat history:
```typescript
// Get a specific chat by ID
const chat = await aiChatService.getChat('chat-id');

// Get all chats
const allChats = await aiChatService.getChats();

// Get the messages for a specific chat
const messages = await aiChatService.getChatMessages('chat-id');
```

### Entity Interfaces
The package includes TypeScript interfaces that match the backend entity definitions:
```typescript
import { GsbAiChat, GsbAiMessage, LlmConfiguration, LLMProvider } from '@gsb-core/ai-assistant';

// Create a new LLM configuration
const config: LlmConfiguration = {
  title: 'My GPT-4 Configuration',
  provider: LLMProvider.OpenAI,
  template: 'Given the following context: {{ context }}\n\nQuestion: {{ prompt }}',
  settings: {
    model: 'gpt-4',
    temperature: 0.7
  }
};
```

## Server-side Integration
This package relies on the following GSB backend components:
- Entity definitions for `GsbAiChat`, `GsbAiMessage`, and `LlmConfiguration`
- The `aiChat` serverless function for processing chat requests
## Development

### Building the Package

```bash
npm run build
```

### Running Tests

```bash
npm test
```

## License

MIT
## Entity Definitions

The package includes TypeScript interfaces for the following entity definitions:

- `GsbAiChat`: Represents an AI chat session
- `GsbAiMessage`: Represents a message in the chat
- `LlmConfiguration`: Configuration for LLM models
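As a rough orientation, the shapes of these interfaces can be sketched from the usage examples in this README. The field names below are illustrative inferences, not the package's actual declarations; consult the exported types in `@gsb-core/ai-assistant` for the authoritative definitions.

```typescript
// Illustrative sketch only: shapes inferred from this README's examples,
// not copied from the package's actual type declarations.
enum LLMProvider {
  OpenAI = 'OpenAI',
  AzureOpenAI = 'AzureOpenAI',
  Anthropic = 'Anthropic',
  HuggingFace = 'HuggingFace',
}

interface LlmConfiguration {
  id?: string;        // Assigned by the backend on save (hypothetical)
  title: string;
  provider: LLMProvider;
  template?: string;  // Nunjucks template used for prompt construction
  settings?: {
    model?: string;
    temperature?: number;
    maxTokens?: number;
  };
}

interface GsbAiChat {
  id: string;         // Referenced as result.chat.id in the chat flow
  title?: string;     // Hypothetical
}

interface GsbAiMessage {
  id: string;
  chatId?: string;    // Hypothetical link back to the owning chat
  content?: string;   // Hypothetical
}

// Constructing a configuration against the sketched shape:
const cfg: LlmConfiguration = {
  title: 'Sketch',
  provider: LLMProvider.OpenAI,
  settings: { model: 'gpt-4', temperature: 0.7 },
};
```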
## LLM Configuration Templates

LLM configurations support Nunjucks templates, allowing you to create dynamic prompts based on entity data. The backend processes these templates automatically.
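To illustrate what template rendering does conceptually, here is a purely illustrative toy stand-in for variable substitution. It is not the backend's implementation: the backend uses Nunjucks, which additionally supports `{% if %}` and `{% for %}` blocks shown in the examples below.

```typescript
// Toy illustration of {{ variable }} substitution from a context object.
// NOT the actual backend renderer (which is Nunjucks).
function renderToy(template: string, context: Record<string, unknown>): string {
  return template.replace(/\{\{\s*([\w.]+)\s*\}\}/g, (_match, path: string) => {
    // Walk dotted paths like "entity.title" through the context object.
    const value = path
      .split('.')
      .reduce<unknown>(
        (obj, key) => (obj as Record<string, unknown> | undefined)?.[key],
        context
      );
    return value === undefined ? '' : String(value);
  });
}

const out = renderToy('Question: {{ prompt }} about {{ entity.title }}', {
  prompt: 'Status?',
  entity: { title: 'Apollo' },
});
// out === "Question: Status? about Apollo"
```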
### Creating an LLM Configuration

Here's an example of creating and saving an LLM configuration with a template:
```typescript
import { LLMProvider, LlmConfiguration } from '@gsb-core/ai-assistant';
import { GsbEntityService } from '@gsb-core/core';

// Initialize the entity service
const entityService = new GsbEntityService(runtime);

// Create a new LLM configuration
const llmConfig: LlmConfiguration = {
  title: "Project Assistant Configuration",
  provider: LLMProvider.OpenAI,
  template: `
You are assisting with project information.

User prompt: {{ prompt }}

Project details:
{% if entity.title %}
- Project title: {{ entity.title }}
{% endif %}
{% if entity.teamSize %}
- Team size: {{ entity.teamSize }} members
{% endif %}
{% if entity.budget %}
- Budget: \${{ entity.budget }}
{% endif %}

Provide a helpful response based on this information.
`,
  settings: {
    model: "gpt-4",
    temperature: 0.7,
    maxTokens: 1000
  }
};

// Save the configuration
const saveResult = await entityService.save({
  entityDef: { name: 'LlmConfiguration' },
  entity: llmConfig
});

// Get the configuration ID for later use
const llmConfigId = saveResult.entityId;
console.log(`Created LLM configuration with ID: ${llmConfigId}`);
```

Note the `\$` in the budget line: inside a JavaScript template literal, an unescaped `${{ ... }}` would be parsed as `${}` interpolation, so the dollar sign must be escaped to reach the Nunjucks template intact.

### Template Variables
Templates have access to:
- `prompt`: The original user prompt
- `chat`: The current chat object
- `data`: The context data object
- Any properties of the context data (spread at the top level)
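The following sketch shows how such a render context might be assembled, with the context data's own properties spread to the top level. This is an assumed reconstruction from the variable list above, not the backend's actual code.

```typescript
// Sketch of the assumed template-context assembly; the real backend
// logic may differ in detail.
const data = { entity: { title: 'Apollo', teamSize: 4 } };
const chat = { id: 'chat-1' };
const prompt = "What's the status?";

// Context-data properties are spread to the top level, so a template can
// reference {{ entity.title }} directly as well as {{ data.entity.title }}.
const templateContext = { prompt, chat, data, ...data };

console.log(templateContext.entity.title); // "Apollo"
```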
### Template Examples
Basic template with entity data:
```
You are assisting with project information.

User prompt: {{ prompt }}

Project details:
{% if entity.title %}
- Project title: {{ entity.title }}
{% endif %}
{% if entity.teamSize %}
- Team size: {{ entity.teamSize }} members
{% endif %}
{% if entity.deadline %}
- Deadline: {{ entity.deadline }}
{% endif %}

Provide a helpful response based on this information.
```

Conditional template with data processing:
```
{% if entity.type == 'customer' %}
You are helping a customer with support questions.
{% elif entity.type == 'employee' %}
You are assisting an employee with internal processes.
{% else %}
You are providing general information.
{% endif %}

{{ prompt }}

{% if entity.previousInteractions %}
Previous interactions:
{% for interaction in entity.previousInteractions %}
- {{ interaction.date }}: {{ interaction.summary }}
{% endfor %}
{% endif %}
```

### Using Templates
1. Create an LLM configuration entity with your template in the `template` field
2. Save the configuration to get its ID
3. Use that ID when calling the `chat` method:

```typescript
const result = await aiChatService.chat(
  userPrompt,
  llmConfigurationId, // The ID of your saved configuration
  chatId,
  contextData
);
```

## Serverless Function Template
The package includes a reference implementation of the `aiChat` serverless function, which should be deployed to the GSB backend. This function handles:
- Processing AI chat requests
- Managing chat history
- Template rendering (via backend nunjucks)
- Integration with LLM providers
To use it, you need to:
1. Install the AI Assistant module in your GSB backend workspace
2. Configure the appropriate LLM provider credentials via the module config
3. Create LLM configurations with templates
## Publishing to NPM

To publish the package to npm:

1. Update the version number in `package.json`
2. Run the publish script:

```bash
cd packages/gsb-ai-assistant
npm run publish-package
```

This will:

1. Clean the `dist` directory
2. Run linting and tests
3. Build the package
4. Copy necessary files
5. Publish to npm
## Service ID

The AI Chat Service is registered with ID: `8562037a-8f11-4eae-ba97-6b7ccd852e57`