@aigne/transport v0.3.6 • License: Elastic-2.0

@aigne/transport


AIGNE Transport SDK, providing HTTP client and server implementations for communication between AIGNE components in the AIGNE Framework.

Introduction

@aigne/transport provides a robust communication layer for AIGNE components, enabling seamless interaction between the different parts of an AI application. The package offers HTTP client and server implementations that follow a consistent protocol, making it straightforward to build distributed AI systems with the AIGNE Framework.

Features

  • HTTP client implementation: an easy-to-use client for communicating with AIGNE servers
  • HTTP server implementation: a flexible server that integrates with popular Node.js frameworks
  • Framework agnostic: works with Express, Hono, and other Node.js HTTP frameworks
  • Streaming support: first-class support for streaming responses
  • Type safety: comprehensive TypeScript type definitions for all APIs
  • Error handling: robust error handling with detailed error messages
  • Middleware support: compatible with common HTTP middleware, such as compression
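As a sketch of the middleware support noted above: ordinary Express middleware can be placed in front of the AIGNE route. The `compression` package and the route path here are illustrative assumptions, and the `aigne` instance is assumed to be constructed as in the examples below.

```typescript
import compression from "compression";
import express from "express";
import { AIGNEHTTPServer } from "@aigne/transport/http-server/index.js";

// Assumes an AIGNE instance built as in the Express example below.
declare const aigne: any;

const aigneServer = new AIGNEHTTPServer(aigne);

const server = express();
// Ordinary HTTP middleware can sit in front of the AIGNE handler.
server.use(compression());
server.post("/aigne/invoke", async (req, res) => {
  await aigneServer.invoke(req, res);
});
server.listen(3000);
```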

Installation

Using npm

npm install @aigne/transport @aigne/core

Using yarn

yarn add @aigne/transport @aigne/core

Using pnpm

pnpm add @aigne/transport @aigne/core

Basic Usage

Server-side Usage

The AIGNE HTTP server can be used with the Express or Hono frameworks.

Express Example

import { AIAgent, AIGNE } from "@aigne/core";
import { AIGNEHTTPClient } from "@aigne/transport/http-client/index.js";
import { AIGNEHTTPServer } from "@aigne/transport/http-server/index.js";
import express from "express";
import { OpenAIChatModel } from "../_mocks_/mock-models.js"; // test mock; substitute your real model provider

const model = new OpenAIChatModel();

const chat = AIAgent.from({
  name: "chat",
});

// AIGNE: the main execution engine of the AIGNE Framework.
const aigne = new AIGNE({ model, agents: [chat] });

// Create an AIGNEHTTPServer instance
const aigneServer = new AIGNEHTTPServer(aigne);

// Set up the server to handle incoming requests
const server = express();
server.post("/aigne/invoke", async (req, res) => {
  await aigneServer.invoke(req, res);
});

// Example values; use any free port for your deployment
const port = 3000;
const url = `http://localhost:${port}/aigne/invoke`;
const httpServer = server.listen(port);

// Create an AIGNEHTTPClient instance
const client = new AIGNEHTTPClient({ url });

// Invoke the agent through the client
const response = await client.invoke("chat", { $message: "hello" });

console.log(response); // Output: {$message: "Hello world!"}

Hono Example

import { AIAgent, AIGNE } from "@aigne/core";
import { AIGNEHTTPClient } from "@aigne/transport/http-client/index.js";
import { AIGNEHTTPServer } from "@aigne/transport/http-server/index.js";
import { serve } from "bun";
import { Hono } from "hono";
import { OpenAIChatModel } from "../_mocks_/mock-models.js"; // test mock; substitute your real model provider

const model = new OpenAIChatModel();

const chat = AIAgent.from({
  name: "chat",
});

// AIGNE: the main execution engine of the AIGNE Framework.
const aigne = new AIGNE({ model, agents: [chat] });

// Create an AIGNEHTTPServer instance
const aigneServer = new AIGNEHTTPServer(aigne);

// Set up the server to handle incoming requests
const honoApp = new Hono();
honoApp.post("/aigne/invoke", async (c) => {
  return aigneServer.invoke(c.req.raw);
});

// Example values; use any free port for your deployment
const port = 3000;
const url = `http://localhost:${port}/aigne/invoke`;
const server = serve({ port, fetch: honoApp.fetch });

// Create an AIGNEHTTPClient instance
const client = new AIGNEHTTPClient({ url });

// Invoke the agent through the client
const response = await client.invoke("chat", { $message: "hello" });
console.log(response); // Output: {$message: "Hello world!"}

HTTP Client

import { AIGNEHTTPClient } from "@aigne/transport/http-client/index.js";

// The endpoint exposed by your AIGNE server
const url = "http://localhost:3000/aigne/invoke";
const client = new AIGNEHTTPClient({ url });

const response = await client.invoke("chat", { $message: "hello" });

console.log(response); // Output: {$message: "Hello world!"}

Streaming Responses

import { AIGNEHTTPClient } from "@aigne/transport/http-client/index.js";

// The endpoint exposed by your AIGNE server
const url = "http://localhost:3000/aigne/invoke";
const client = new AIGNEHTTPClient({ url });

const stream = await client.invoke(
  "chat",
  { $message: "hello" },
  { streaming: true },
);

let text = "";
for await (const chunk of stream) {
  if (chunk.delta.text?.$message) text += chunk.delta.text.$message;
}

console.log(text); // Output: "Hello world!"
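The accumulation loop above can be factored into a small standalone helper. The `ChunkDelta` shape below is an assumption inferred from the example, not a type exported by @aigne/transport:

```typescript
// Minimal chunk shape inferred from the streaming example above
// (an assumption, not the package's actual exported type).
interface ChunkDelta {
  delta: { text?: { $message?: string } };
}

// Concatenate the $message fragments carried by a sequence of delta chunks.
function accumulate(chunks: Iterable<ChunkDelta>): string {
  let text = "";
  for (const chunk of chunks) {
    if (chunk.delta.text?.$message) text += chunk.delta.text.$message;
  }
  return text;
}
```

With a live stream you would iterate with `for await` exactly as shown above; the accumulation logic is identical.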

License

Elastic-2.0
