@gsb-core/ai-assistant v0.1.8

License: MIT • Repository: GitHub • Last release: 6 months ago

GSB AI Assistant

The GSB AI Assistant package provides integrations with various AI services and models through the GSB Core platform. It includes components for chat-based interactions using different LLM providers.

Features

  • AI Chat Service for communicating with LLM providers
  • Support for multiple LLM providers (OpenAI, Azure OpenAI, Anthropic, HuggingFace)
  • Template-based prompt processing
  • Chat history management
  • Entity interfaces for typed data

Installation

The package is designed to work within the GSB workspace using the main node_modules directory:

# From the GSB workspace root:
cd packages/gsb-ai-assistant
npm run setup

Usage

AI Chat Service

The primary component is GsbAiChatService, which provides methods for interacting with AI chat models:

import { GsbAiChatService } from '@gsb-core/ai-assistant';

// Get singleton instance
const aiChatService = GsbAiChatService.getInstance();

// Set the runtime service (required for running serverless functions)
aiChatService.setRuntime(runtimeService);

// Start a new chat
const result = await aiChatService.chat(
  "What's the status of our project?",
  "llm-config-id",  // ID of saved LlmConfiguration
  undefined,        // New chat
  { entity: projectEntity }
);

// Continue the conversation
const followUp = await aiChatService.chat(
  "What should be our next steps?",
  "llm-config-id",
  result.chat.id,   // Use existing chat ID
  { entity: projectEntity }
);
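The pattern above can be generalized: each call returns a chat object whose ID is passed into the next call to keep the conversation in one thread. The sketch below illustrates this threading with a minimal stand-in interface; `ChatResult`, `ChatFn`, and `runConversation` are illustrative names, not part of the package's actual API.

```typescript
// Minimal sketch (not the real GSB types): thread a chat ID across turns
// so follow-up prompts land in the same conversation.
interface ChatResult {
  chat: { id: string };
}

type ChatFn = (
  prompt: string,
  llmConfigId: string,
  chatId: string | undefined,
  context: Record<string, unknown>
) => Promise<ChatResult>;

// Sends each prompt in order, reusing the chat ID returned by the first turn.
async function runConversation(
  chat: ChatFn,
  prompts: string[],
  llmConfigId: string,
  context: Record<string, unknown>
): Promise<ChatResult[]> {
  const results: ChatResult[] = [];
  let chatId: string | undefined; // undefined on the first turn => new chat
  for (const prompt of prompts) {
    const result = await chat(prompt, llmConfigId, chatId, context);
    chatId = result.chat.id; // subsequent turns reuse this ID
    results.push(result);
  }
  return results;
}
```

With the real service you would pass `aiChatService.chat.bind(aiChatService)` (or an equivalent wrapper) as the `chat` argument.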

Chat History Management

The service includes methods for retrieving chat history:

// Get a specific chat by ID
const chat = await aiChatService.getChat('chat-id');

// Get all chats
const allChats = await aiChatService.getChats();

// Get messages for a specific chat
const messages = await aiChatService.getChatMessages('chat-id');
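A common use of `getChatMessages` is rendering a transcript for display or logging. The helper below sketches that; the `role`/`content` message shape is an assumption for illustration, not the actual `GsbAiMessage` interface.

```typescript
// Sketch: turn a chat's messages into a readable transcript.
// The message shape here (role/content) is an assumed simplification,
// not the actual GsbAiMessage entity definition.
interface SimpleMessage {
  role: "user" | "assistant";
  content: string;
}

function formatTranscript(messages: SimpleMessage[]): string {
  return messages
    .map((m) => `${m.role === "user" ? "User" : "Assistant"}: ${m.content}`)
    .join("\n");
}
```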

Entity Interfaces

The package includes TypeScript interfaces that match the backend entity definitions:

import { GsbAiChat, GsbAiMessage, LlmConfiguration, LLMProvider } from '@gsb-core/ai-assistant';

// Create a new LLM configuration
const config: LlmConfiguration = {
  title: 'My GPT-4 Configuration',
  provider: LLMProvider.OpenAI,
  template: 'Given the following context: {{ context }}\n\nQuestion: {{ prompt }}',
  settings: {
    model: 'gpt-4',
    temperature: 0.7
  }
};
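Before saving a configuration it can be useful to sanity-check it on the client. The following is a hedged sketch; which fields the backend actually requires, and the valid temperature range, are assumptions to adjust against the real entity definition.

```typescript
// Sketch of a client-side sanity check before saving a configuration.
// The required fields and temperature bounds are assumptions; align them
// with the backend's LlmConfiguration entity definition.
interface LlmConfigLike {
  title?: string;
  provider?: unknown;
  template?: string;
  settings?: { model?: string; temperature?: number };
}

function validateLlmConfig(config: LlmConfigLike): string[] {
  const errors: string[] = [];
  if (!config.title) errors.push("title is required");
  if (config.provider === undefined) errors.push("provider is required");
  if (!config.settings?.model) errors.push("settings.model is required");
  const t = config.settings?.temperature;
  if (t !== undefined && (t < 0 || t > 2)) {
    errors.push("temperature should be between 0 and 2");
  }
  return errors;
}
```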

Server-side Integration

This package relies on the following GSB backend components:

  1. Entity definitions for GsbAiChat, GsbAiMessage, and LlmConfiguration
  2. The aiChat serverless function for processing chat requests

Development

Building the Package

npm run build

Running Tests

npm test

License

MIT

Entity Definitions

The package includes TypeScript interfaces for the following entity definitions:

  • GsbAiChat: Represents an AI chat session
  • GsbAiMessage: Represents a message in the chat
  • LlmConfiguration: Configuration for LLM models

LLM Configuration Templates

LLM configurations support Nunjucks templates, allowing you to create dynamic prompts based on entity data. The backend processes these templates automatically.
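To make the rendering step concrete, here is a simplified stand-in for what the backend does with `{{ ... }}` placeholders. This is not the backend's implementation (which uses Nunjucks and also supports `{% if %}`/`{% for %}` blocks); it only illustrates variable interpolation with dotted paths.

```typescript
// Simplified stand-in for the backend's Nunjucks rendering: substitutes
// {{ name }} and {{ obj.path }} placeholders from a context object.
// Unlike Nunjucks it does NOT support {% if %} / {% for %} blocks.
function renderSimple(template: string, context: Record<string, unknown>): string {
  return template.replace(/\{\{\s*([\w.]+)\s*\}\}/g, (_match, path: string) => {
    // Walk the dotted path, e.g. "entity.title" -> context.entity.title
    const value = path
      .split(".")
      .reduce<unknown>((obj, key) => (obj as Record<string, unknown> | undefined)?.[key], context);
    return value === undefined ? "" : String(value);
  });
}
```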

Creating an LLM Configuration

Here's an example of creating and saving an LLM configuration with a template:

import { LLMProvider, LlmConfiguration } from '@gsb-core/ai-assistant';
import { GsbEntityService } from '@gsb-core/core';

// Initialize entity service
const entityService = new GsbEntityService(runtime);

// Create a new LLM configuration
const llmConfig: LlmConfiguration = {
  title: "Project Assistant Configuration",
  provider: LLMProvider.OpenAI,
  template: `
You are assisting with project information.

User prompt: {{ prompt }}

Project details:
{% if entity.title %}
- Project title: {{ entity.title }}
{% endif %}
{% if entity.teamSize %}
- Team size: {{ entity.teamSize }} members
{% endif %}
{% if entity.budget %}
- Budget: ${{ entity.budget }}
{% endif %}

Provide a helpful response based on this information.
  `,
  settings: {
    model: "gpt-4",
    temperature: 0.7,
    maxTokens: 1000
  }
};

// Save the configuration
const saveResult = await entityService.save({
  entityDef: { name: 'LlmConfiguration' },
  entity: llmConfig
});

// Get the configuration ID for later use
const llmConfigId = saveResult.entityId;
console.log(`Created LLM configuration with ID: ${llmConfigId}`);

Template Variables

Templates have access to:

  • prompt: The original user prompt
  • chat: The current chat object
  • data: The context data object
  • Any properties of the context data (spread at top level)
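The "spread at top level" behavior can be sketched as follows. The exact variable set is defined by the backend; `buildTemplateContext` is an illustrative name, and the shape shown here is an assumption based on the variable list above.

```typescript
// Sketch of how the backend might assemble template variables: the context
// data object is exposed as `data`, and its properties are also spread at
// the top level (so {{ entity.title }} and {{ data.entity.title }} both work).
function buildTemplateContext(
  prompt: string,
  chat: { id: string },
  data: Record<string, unknown>
): Record<string, unknown> {
  return {
    ...data, // top-level access, e.g. {{ entity.title }}
    prompt,  // {{ prompt }}
    chat,    // {{ chat.id }}
    data,    // {{ data.entity.title }}
  };
}
```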

Template Examples

Basic template with entity data:

You are assisting with project information.

User prompt: {{ prompt }}

Project details:
{% if entity.title %}
- Project title: {{ entity.title }}
{% endif %}
{% if entity.teamSize %}
- Team size: {{ entity.teamSize }} members
{% endif %}
{% if entity.deadline %}
- Deadline: {{ entity.deadline }}
{% endif %}

Provide a helpful response based on this information.

Conditional template with data processing:

{% if entity.type == 'customer' %}
You are helping a customer with support questions.
{% elif entity.type == 'employee' %}
You are assisting an employee with internal processes.
{% else %}
You are providing general information.
{% endif %}

{{ prompt }}

{% if entity.previousInteractions %}
Previous interactions:
{% for interaction in entity.previousInteractions %}
- {{ interaction.date }}: {{ interaction.summary }}
{% endfor %}
{% endif %}

Using Templates

  1. Create an LLM configuration entity with your template in the template field
  2. Save the configuration to get its ID
  3. Use that ID when calling the chat method:
const result = await aiChatService.chat(
  userPrompt,
  llmConfigurationId, // The ID of your saved configuration
  chatId,
  contextData
);

Serverless Function Template

The package includes a reference implementation of the aiChat serverless function that should be deployed to the GSB backend. This function handles:

  • Processing AI chat requests
  • Managing chat history
  • Template rendering (via backend nunjucks)
  • Integration with LLM providers

To use it, you need to:

  1. Install the AI Assistant module in your GSB backend workspace
  2. Configure credentials for the appropriate LLM providers via the module config
  3. Create LLM configurations with templates

Publishing to NPM

To publish the package to npm:

  1. Update the version number in package.json
  2. Run the publish script:
cd packages/gsb-ai-assistant
npm run publish-package

This will:

  • Clean the dist directory
  • Run linting and tests
  • Build the package
  • Copy necessary files
  • Publish to npm

Service ID

The AI Chat Service is registered with ID: 8562037a-8f11-4eae-ba97-6b7ccd852e57
