@maximai/maxim-js-langchain v1.1.0
# Maxim SDK: Langchain wrapper

This is the Langchain wrapper built for the Maxim JS SDK. It simplifies integrating the Maxim logger with Langchain.
## Install

```bash
npm install @maximai/maxim-js-langchain
```

## Initialize Maxim logger

```js
import { Maxim } from "@maximai/maxim-js";

const maxim = new Maxim({ apiKey: "maxim-api-key" });
const logger = await maxim.logger({ id: "log-repository-id" });
```

## Start logging
```js
import { ChatOpenAI } from "@langchain/openai";
import { MaximLangchainTracer } from "@maximai/maxim-js-langchain";

const maximTracer = new MaximLangchainTracer(logger);

// Assumes the OpenAI API key is available in the environment.
const openAIKey = process.env.OPENAI_API_KEY;

const llm = new ChatOpenAI({
    openAIApiKey: openAIKey,
    modelName: "gpt-4o",
    temperature: 0,
    callbacks: [maximTracer],
    metadata: {
        maxim: {
            // optional: add metadata to the generation
            generationName: "maths-gen",
            generationTags: { tag: "test" },
        },
    },
});

const query = "What's the sum of 3 and 2?";
const result = await llm.invoke(query);
```

## Langchain module compatibility
| Module | Anthropic | Bedrock Anthropic | Bedrock Meta | OpenAI | Azure |
|---|---|---|---|---|---|
| Chat (0.3.x) | ✅ | ✅ | ✅ | ✅ | ✅ |
| Tool call (0.3.x) | ✅ | ✅ | ⛔️ | ✅ | ✅ |
| Chain (via LLM) (0.3.x) | ✅ | ✅ | ✅ | ✅ | ✅ |
| Streaming (0.3.x) | ✅ | ✅ | ✅ | ✳️ | ✳️ |
| Agent (0.3.x) | ⛔️ | ⛔️ | ⛔️ | ⛔️ | ⛔️ |
✳️ Token usage is not supported by Langchain in these cases.
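Chains can also be traced by passing the tracer in the invocation config rather than on the model constructor. The snippet below is a minimal sketch assuming the `logger` created in the initialization step above; the prompt text and variable names are illustrative, not part of the SDK.

```js
import { ChatPromptTemplate } from "@langchain/core/prompts";
import { ChatOpenAI } from "@langchain/openai";
import { MaximLangchainTracer } from "@maximai/maxim-js-langchain";

// Reuses the `logger` created in "Initialize Maxim logger" above.
const maximTracer = new MaximLangchainTracer(logger);

// A minimal LCEL chain: prompt -> chat model.
const prompt = ChatPromptTemplate.fromTemplate("Answer briefly: {question}");
const llm = new ChatOpenAI({ modelName: "gpt-4o", temperature: 0 });
const chain = prompt.pipe(llm);

// Callbacks passed in the invocation config propagate to the steps of the chain,
// so the chain run is logged along with the underlying LLM call.
const result = await chain.invoke(
    { question: "What's the sum of 3 and 2?" },
    { callbacks: [maximTracer] }
);
```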
## Version changelog
1.1.0
- Fix: prevent creating multiple traces for a single Langchain call
1.0.0
- Feat: adds LangGraph support
0.2.7
- Feat: add support for tool call logging via callbacks
0.2.6
- Fix: token usage capture for Anthropic models
0.2.5
- Fix: model name and provider parsing while using Langchain
0.2.4
- Fix: Peer dependency changed from langchain to @langchain/core
- Chore: tsconfig target moved to es2017
0.2.3
- Feat: Updates peer dep for Maxim SDK
0.2.2
- Feat: Adds support for Langchain 0.3.x
0.2.1
- Improvement: easy logging of retrieval and generation
0.1.1
- Breaking: Dropping support for Langchain versions below 0.3.x
0.0.4
- Improvement: Changed pinning for the core Maxim SDK
0.0.3
- Fix: Token usage was not getting captured for streaming responses
0.0.2
- Fix: Adds empty tags as {}
0.0.1
- Early preview