@devlink/ai v0.0.15-alpha.0

License: ISC • Published 11 months ago

ai

This npm package, @devlink/ai, provides a high-level API for embedding code documents with OpenAI embeddings and querying them through a language-model agent.

This package allows you to:

  • Load code documents from a directory
  • Embed documents using OpenAI embeddings
  • Organize embedded documents in a vector store for easy searching and retrieval
  • Use either OpenAI or Azure OpenAI as the model backend

Install

Install the package with npm:

npm install @devlink/ai

Usage

The following is a sample usage of the package:

import { embeddingCode, llmConfig } from '@devlink/ai';

const openAIConfig: llmConfig = {
  openAIApiKey: 'your-openai-api-key',
};

const openAIEmbeddingConfig: llmConfig = {
  openAIApiKey: 'your-openai-api-key',
};

const directoryPath = './path/to/your/documents';
const fileTypeArray = ['ts', 'js', 'rs'];

const { agent } = await embeddingCode({
  directoryPath,
  fileTypeArray,
  openAIConfig,
  openAIEmbeddingConfig,
});
const input = 'Explain the meaning of these codes step by step.';
const result = await agent.call({ input });

Use in @devlink/cli

See the devlink/ai-examples repository for examples of using this package from @devlink/cli.

API

embeddingCode

The embeddingCode function is an asynchronous function that loads, embeds, and organizes code documents from a directory.

export const embeddingCode = async ({
  directoryPath,
  fileTypeArray,
  openAIConfig,
  openAIEmbeddingConfig,
}: ExplainCodeOptions) => { ... }

Parameters:

  • directoryPath (string): The path of the directory containing the documents to be loaded.
  • fileTypeArray (string[]): The file extensions to load (e.g. 'ts', 'js', 'rs').
  • openAIConfig (llmConfig): Configuration object for OpenAI language model.
  • openAIEmbeddingConfig (llmConfig): Configuration object for OpenAI Embedding model.
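
The package does not document how fileTypeArray is applied internally, but it presumably filters the directory listing by extension. A hypothetical sketch of such filtering (the helper matchesFileType is not part of the @devlink/ai API):

```typescript
// Hypothetical sketch: filter filenames by extension, as fileTypeArray
// suggests the loader does. Not part of the @devlink/ai API.
import * as path from 'path';

function matchesFileType(filename: string, fileTypeArray: string[]): boolean {
  // path.extname returns '.ts'; strip the leading dot before comparing.
  const ext = path.extname(filename).slice(1);
  return fileTypeArray.includes(ext);
}

const files = ['index.ts', 'lib.rs', 'README.md', 'util.js'];
const fileTypeArray = ['ts', 'js', 'rs'];
const loaded = files.filter((f) => matchesFileType(f, fileTypeArray));
// loaded: ['index.ts', 'lib.rs', 'util.js']
```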

Return:

The function returns a Promise that resolves with an object containing the agent for the created vector store.
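
The store's internals are not documented, but vector stores generally answer queries by ranking stored embeddings against the query embedding with cosine similarity. A minimal sketch with toy vectors (none of these names come from the package, and real usage would embed text via the OpenAI API rather than hard-code vectors):

```typescript
// Toy cosine-similarity retrieval over pre-computed embeddings.
// Illustrative only; @devlink/ai builds on real OpenAI embeddings.
function cosineSimilarity(a: number[], b: number[]): number {
  const dot = a.reduce((sum, x, i) => sum + x * b[i], 0);
  const norm = (v: number[]) => Math.sqrt(v.reduce((s, x) => s + x * x, 0));
  return dot / (norm(a) * norm(b));
}

const store = [
  { text: 'parse config file', vector: [0.9, 0.1, 0.0] },
  { text: 'open TCP socket', vector: [0.1, 0.9, 0.2] },
];
const query = [0.85, 0.15, 0.05]; // stand-in embedding of "read settings"

// Rank documents by similarity to the query vector; keep the best match.
const best = store
  .map((doc) => ({ ...doc, score: cosineSimilarity(query, doc.vector) }))
  .sort((a, b) => b.score - a.score)[0];
// best.text === 'parse config file'
```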

llmConfig

llmConfig is a type alias representing the configuration for the OpenAI or Azure OpenAI language model.

export type llmConfig = Partial<OpenAIInput> & Partial<AzureOpenAIInput> & BaseLLMParams;
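
Because llmConfig accepts both OpenAI and Azure OpenAI fields, a caller can choose the backend at runtime, for example from environment variables. A sketch under assumptions: the local llmConfigSketch type mirrors only the fields shown in this README (the real type comes from the package), and the environment-variable names are invented for illustration:

```typescript
// Minimal local stand-in for the package's llmConfig type, covering only
// the fields this README shows. An assumption for illustration.
type llmConfigSketch = {
  openAIApiKey?: string;
  azureOpenAIApiKey?: string;
  azureOpenAIApiVersion?: string;
  azureOpenAIApiInstanceName?: string;
  azureOpenAIApiDeploymentName?: string;
};

// Pick Azure OpenAI when its key is present, otherwise plain OpenAI.
// The env var names below are hypothetical.
function selectConfig(env: Record<string, string | undefined>): llmConfigSketch {
  if (env.AZURE_OPENAI_API_KEY) {
    return {
      azureOpenAIApiKey: env.AZURE_OPENAI_API_KEY,
      azureOpenAIApiVersion: env.AZURE_OPENAI_API_VERSION,
      azureOpenAIApiInstanceName: env.AZURE_OPENAI_API_INSTANCE,
      azureOpenAIApiDeploymentName: env.AZURE_OPENAI_API_DEPLOYMENT,
    };
  }
  return { openAIApiKey: env.OPENAI_API_KEY };
}

const config = selectConfig({ OPENAI_API_KEY: 'sk-example' });
// config.openAIApiKey === 'sk-example'
```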

Examples

A fuller example, showing both OpenAI and Azure OpenAI configuration:

import { embeddingCode, llmConfig } from '@devlink/ai';

const openAIConfig: llmConfig = {
  openAIApiKey: 'your-openai-api-key',
};

const azureOpenAIConfig: llmConfig = {
  azureOpenAIApiVersion: '2022-12-01',
  azureOpenAIApiKey: 'your-azure-openai-api-key',
  azureOpenAIApiInstanceName: 'your-azure-openai-api-instance-name',
  azureOpenAIApiDeploymentName: 'your-azure-openai-api-deployment-name',
  azureOpenAIApiEmbeddingsDeploymentName: 'your-azure-openai-api-embeddings-deployment-name',
};

// Pass either openAIConfig or azureOpenAIConfig as the model configuration.

const openAIEmbeddingConfig: llmConfig = {
  openAIApiKey: 'your-openai-api-key',
};

const directoryPath = './path/to/your/documents';
const fileTypeArray = ['ts', 'js', 'rs'];

const { agent } = await embeddingCode({
  directoryPath,
  fileTypeArray,
  openAIConfig,
  openAIEmbeddingConfig,
});
const input = 'Explain the meaning of these codes step by step.';
const result = await agent.call({ input });