# @agenite/bedrock

v0.5.0 · Published 8 months ago
AWS Bedrock provider for Agenite, enabling seamless integration with Amazon's foundation models through the Bedrock runtime.
## Features
- 🚀 Easy Integration - Simple setup with AWS Bedrock
- 🔄 Streaming Support - Real-time streaming responses
- 🛠️ Tool Integration - Native support for function calling
- 🎯 Model Selection - Support for multiple Bedrock models
- 🔐 AWS Authentication - Automatic credential management
- 🌐 Region Support - Configurable AWS regions
## Installation

```bash
npm install @agenite/bedrock
```

## Quick Start

```typescript
import { BedrockProvider } from '@agenite/bedrock';

// Initialize the provider
const provider = new BedrockProvider({
  model: 'anthropic.claude-3-5-haiku-20241022-v1:0',
  region: 'us-west-2',
});

// Generate a simple response
const result = await provider.generate(
  'What are the main features of Llama 2?'
);
console.log(result);

// Use streaming for real-time responses
const generator = provider.stream('Tell me about AWS Bedrock.');
for await (const chunk of generator) {
  if (chunk.type === 'text') {
    process.stdout.write(chunk.text);
  }
}
```

## Configuration
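Only `model` and `region` are required; the optional sampling fields can be set at construction time. A sketch of a fuller options object (the numeric values are illustrative, not recommendations):

```typescript
// Shape mirrors the BedrockProviderOptions interface documented below
interface BedrockProviderOptions {
  model: string;
  region: string;
  temperature?: number;
  maxTokens?: number;
  topP?: number;
}

// Illustrative values; pass this object to `new BedrockProvider(options)`
const options: BedrockProviderOptions = {
  model: 'anthropic.claude-3-5-haiku-20241022-v1:0',
  region: 'us-west-2',
  temperature: 0.7,
  maxTokens: 1024,
  topP: 0.9,
};
```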
### Provider Options
```typescript
interface BedrockProviderOptions {
  model: string; // Bedrock model ID
  region: string; // AWS region
  temperature?: number; // Sampling temperature for response generation
  maxTokens?: number; // Maximum tokens in the response
  topP?: number; // Top-p (nucleus) sampling parameter
}
```

### Available Models
- **Anthropic Claude models**
  - `anthropic.claude-3-5-haiku-20241022-v1:0`
  - `anthropic.claude-3-sonnet-20240229-v1:0`
  - `anthropic.claude-instant-v1`
- **Amazon Titan models**
  - `amazon.titan-text-express-v1`
  - `amazon.titan-text-lite-v1`
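To keep raw model IDs out of call sites, a small lookup table can map friendly aliases to the IDs above; the alias names here are arbitrary and not part of the package:

```typescript
// Map of friendly aliases to the Bedrock model IDs listed above.
// Only the IDs come from Bedrock; the aliases are arbitrary.
const MODELS = {
  'claude-3-5-haiku': 'anthropic.claude-3-5-haiku-20241022-v1:0',
  'claude-3-sonnet': 'anthropic.claude-3-sonnet-20240229-v1:0',
  'claude-instant': 'anthropic.claude-instant-v1',
  'titan-express': 'amazon.titan-text-express-v1',
  'titan-lite': 'amazon.titan-text-lite-v1',
} as const;

type ModelAlias = keyof typeof MODELS;

// Resolve an alias to its Bedrock model ID
function modelId(alias: ModelAlias): string {
  return MODELS[alias];
}
```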
## Advanced Usage

### Usage with Claude 3.7 thinking
```typescript
const provider = new BedrockProvider({
  model: 'us.anthropic.claude-3-7-sonnet-20250219-v1:0',
  region: 'us-east-2',
  converseCommandConfig: {
    additionalModelRequestFields: {
      reasoning_config: {
        type: 'enabled',
        budget_tokens: 1024,
      },
    },
    inferenceConfig: {
      temperature: 1,
    },
  },
});
```

### Tool Integration
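A `ToolDefinition` only declares a JSON Schema; executing the tool when the model requests it is up to your code. A minimal executor for the calculator tool defined below might look like this (the wiring back to the model is omitted, and the input type is hand-written to match the schema):

```typescript
// Input shape matching the calculator tool's JSON Schema
interface CalculatorInput {
  operation: 'add' | 'subtract' | 'multiply' | 'divide';
  a: number;
  b: number;
}

// Dispatch on the operation requested in a toolUse chunk
function runCalculator({ operation, a, b }: CalculatorInput): number {
  switch (operation) {
    case 'add':
      return a + b;
    case 'subtract':
      return a - b;
    case 'multiply':
      return a * b;
    case 'divide':
      if (b === 0) throw new Error('division by zero');
      return a / b;
  }
}
```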
```typescript
import { BedrockProvider } from '@agenite/bedrock';
import type { ToolDefinition } from '@agenite/llm';

// Define a calculator tool
const calculatorTool: ToolDefinition = {
  name: 'calculator',
  description: 'Perform basic arithmetic operations',
  inputSchema: {
    type: 'object',
    properties: {
      operation: {
        type: 'string',
        enum: ['add', 'subtract', 'multiply', 'divide'],
      },
      a: { type: 'number' },
      b: { type: 'number' },
    },
    required: ['operation', 'a', 'b'],
  },
};

// Initialize provider with tool support
const provider = new BedrockProvider({
  model: 'anthropic.claude-3-5-haiku-20241022-v1:0',
  region: 'us-west-2',
});

// Use the tool in a conversation
const messages = [
  {
    role: 'user',
    content: [{ type: 'text', text: 'What is 123 multiplied by 456?' }],
  },
];

const generator = provider.iterate(messages, {
  tools: [calculatorTool],
  stream: true,
  systemPrompt:
    'You are a helpful AI assistant with access to a calculator tool.',
});

// Process the streaming response, including tool usage
for await (const chunk of generator) {
  if (chunk.type === 'text') {
    process.stdout.write(chunk.text);
  } else if (chunk.type === 'toolUse') {
    console.log('Tool use:', chunk);
  }
}
```

### Conversation Management
```typescript
// Maintain conversation history across turns
const messages: BaseMessage[] = [
  { role: 'user', content: [{ type: 'text', text: 'Hello!' }] },
];

const result = await provider.generate(messages, {
  systemPrompt: 'You are a helpful AI assistant.',
});

// Append the assistant's reply so the next turn has full context
messages.push(result);
```

## API Reference
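When unit-testing code that talks to the provider, a stub exposing the same `generate`/`stream` surface avoids real Bedrock calls. This sketch is an assumption based only on the signatures below, with the message and chunk shapes simplified:

```typescript
// Minimal stand-in mirroring the provider surface for tests.
// Returns canned data instead of calling Bedrock.
class StubProvider {
  // Echoes the prompt back as an assistant message
  async generate(prompt: string) {
    return {
      role: 'assistant' as const,
      content: [{ type: 'text' as const, text: `echo: ${prompt}` }],
    };
  }

  // Streams the prompt back one word at a time
  async *stream(prompt: string) {
    for (const word of prompt.split(' ')) {
      yield { type: 'text' as const, text: word };
    }
  }
}
```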
### BedrockProvider Class
```typescript
class BedrockProvider implements LLMProvider {
  constructor(options: BedrockProviderOptions);

  generate(
    messages: string | BaseMessage[],
    options?: GenerateOptions
  ): Promise<BaseMessage>;

  stream(
    messages: string | BaseMessage[],
    options?: StreamOptions
  ): AsyncGenerator<StreamChunk>;

  iterate(
    messages: string | BaseMessage[],
    options?: StreamOptions
  ): AsyncGenerator<StreamChunk>;
}
```

### Message Types
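Concretely, a user turn and an assistant turn carrying a tool call might look like the literals below; the exact `ToolUseBlock` fields are assumptions from common conventions, not confirmed by this package:

```typescript
// A user message containing a single text block
const userTurn = {
  role: 'user',
  content: [{ type: 'text', text: 'What is 2 + 2?' }],
};

// An assistant message containing a tool-use block
// (the `name`/`input` field names are illustrative)
const toolTurn = {
  role: 'assistant',
  content: [
    {
      type: 'toolUse',
      name: 'calculator',
      input: { operation: 'add', a: 2, b: 2 },
    },
  ],
};
```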
```typescript
interface BaseMessage {
  role: 'user' | 'assistant' | 'system';
  content: ContentBlock[];
}

type ContentBlock = TextBlock | ToolUseBlock | ToolResultBlock;
```

## Examples
Check out the examples directory for more:

- `basic-chat.ts` - Simple chat interaction
- `tool-usage.ts` - Advanced tool integration example
## AWS Setup

1. Install the AWS CLI and configure credentials:

   ```bash
   aws configure
   ```

2. Ensure your AWS account has access to Bedrock and the required models.

3. Set up IAM permissions for Bedrock access:

   ```json
   {
     "Version": "2012-10-17",
     "Statement": [
       {
         "Effect": "Allow",
         "Action": [
           "bedrock:InvokeModel",
           "bedrock:InvokeModelWithResponseStream"
         ],
         "Resource": "*"
       }
     ]
   }
   ```

## Contributing
Contributions are welcome! Please feel free to submit a Pull Request.
## License
MIT