@langtrase/typescript-sdk v2.2.0
Langtrace
Open Source & Open Telemetry (OTEL) Observability for LLM applications
Langtrace is open-source observability software that lets you capture, debug, and analyze traces and metrics from all your applications that leverage LLM APIs, Vector Databases, and LLM-based Frameworks.
Open Telemetry Support
The traces generated by Langtrace adhere to the Open Telemetry (OTEL) standard. We are developing semantic conventions for the traces generated by this project. You can check out the current definitions in this repository. Note: this is an ongoing effort, and we encourage you to get involved and welcome your feedback.
Langtrace Cloud ☁️
To use the managed SaaS version of Langtrace, follow the steps below:
- Sign up by going to this link.
- Create a new Project after signing up. Projects are containers for storing traces and metrics generated by your application. If you have only one application, one project is enough.
- Generate an API key from within the project.
- In your application, install the Langtrace SDK and initialize it with the API key you generated in step 3.
- The code for installing and setting up the SDK is shown below.
Getting Started
Get started by adding just three lines to your code!
npm i @langtrase/typescript-sdk
import * as Langtrace from '@langtrase/typescript-sdk' // Must precede any llm module imports
Langtrace.init({ api_key: <your_api_key> })
OR
import * as Langtrace from '@langtrase/typescript-sdk' // Must precede any llm module imports
Langtrace.init() // Reads the API key from the LANGTRACE_API_KEY environment variable
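Once initialized, calls to clients from the supported integrations listed further below are captured automatically. Here is a minimal sketch, assuming the openai package is installed and OPENAI_API_KEY is set; the model name and prompt are illustrative and not part of the SDK:
import * as Langtrace from '@langtrase/typescript-sdk' // Must precede any llm module imports
import OpenAI from 'openai'

Langtrace.init({ api_key: '<your_api_key>' })

const openai = new OpenAI() // Reads OPENAI_API_KEY from the environment

// This call is captured as a trace and sent to Langtrace Cloud automatically
const completion = await openai.chat.completions.create({
  model: 'gpt-4o-mini',
  messages: [{ role: 'user', content: 'Hello, Langtrace!' }]
})
console.log(completion.choices[0].message.content)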
Langtrace Self Hosted
Get started by adding just two lines to your code and see traces logged to the console!
npm i @langtrase/typescript-sdk
import * as Langtrace from '@langtrase/typescript-sdk' // Must precede any llm module imports
Langtrace.init({ write_to_langtrace_cloud: false })
Langtrace self hosted custom exporter
Get started by adding just three lines to your code and see traces exported to your remote location!
npm i @langtrase/typescript-sdk
import * as Langtrace from '@langtrase/typescript-sdk' // Must precede any llm module imports
Langtrace.init({ custom_remote_exporter: <your_exporter>, batch: <true or false> })
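For example, here is a minimal sketch that wires in an OTLP HTTP exporter from @opentelemetry/exporter-trace-otlp-http; the collector URL is a hypothetical placeholder for your own OTEL-compatible endpoint:
import * as Langtrace from '@langtrase/typescript-sdk' // Must precede any llm module imports
import { OTLPTraceExporter } from '@opentelemetry/exporter-trace-otlp-http'

// Hypothetical endpoint; point this at your own OTLP-compatible collector
const exporter = new OTLPTraceExporter({ url: 'http://localhost:4318/v1/traces' })

// batch: true buffers spans and exports them in batches; false exports each span as it ends
Langtrace.init({ custom_remote_exporter: exporter, batch: true })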
Additional Customization
WithLangTraceRootSpan
- This function is designed to organize and relate different spans in a hierarchical manner. When you're performing multiple operations that you want to monitor together as a unit, this function helps by establishing a "parent" span (LangtraceRootSpan, or whatever is passed to name). Then, any calls to the LLM APIs made within the given function (fn) will be considered "children" of this parent span. This setup is especially useful for tracking the performance or behavior of a group of operations collectively, rather than individually. A usage sketch follows the signature below.
/**
 * @param fn The function to be executed within the context of the root span
 * @param name Name of the root span
 * @param spanKind The kind of span to be created
 * @returns result of the function
 */
export async function withLangTraceRootSpan<T> (
  fn: () => Promise<T>,
  name = 'LangtraceRootSpan',
  spanKind: SpanKind = SpanKind.INTERNAL
): Promise<T>
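A minimal usage sketch, assuming withLangTraceRootSpan is exported from the package root alongside init and that the openai package is installed; the model name, prompts, and span name are illustrative:
import * as Langtrace from '@langtrase/typescript-sdk' // Must precede any llm module imports
import OpenAI from 'openai'

Langtrace.init({ api_key: '<your_api_key>' })
const openai = new OpenAI() // Reads OPENAI_API_KEY from the environment

// Both completions are recorded as children of a single 'generate-article' root span
const article = await Langtrace.withLangTraceRootSpan(async () => {
  const outline = await openai.chat.completions.create({
    model: 'gpt-4o-mini',
    messages: [{ role: 'user', content: 'Outline a short post on LLM observability.' }]
  })
  const draft = await openai.chat.completions.create({
    model: 'gpt-4o-mini',
    messages: [{ role: 'user', content: `Write the post: ${outline.choices[0].message.content}` }]
  })
  return draft.choices[0].message.content
}, 'generate-article')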
withAdditionalAttributes
- This function is designed to enhance traces by adding custom attributes to the current context. These custom attributes provide extra details about the operations being performed, making it easier to analyze and understand their behavior. A usage sketch follows the signature below.
/**
*
* @param fn function to be executed within the context with the custom attributes added to the current context
* @param attributes custom attributes to be added to the current context
* @returns result of the function
*/
export async function withAdditionalAttributes <T> (fn: () => Promise<T>, attributes: Partial<LLMSpanAttributes>): Promise<T>
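A minimal usage sketch, assuming withAdditionalAttributes is exported from the package root; the attribute key shown is an illustrative placeholder, so pick keys defined by the LLMSpanAttributes type in your SDK version:
import * as Langtrace from '@langtrase/typescript-sdk' // Must precede any llm module imports
import OpenAI from 'openai'

Langtrace.init({ api_key: '<your_api_key>' })
const openai = new OpenAI() // Reads OPENAI_API_KEY from the environment

// The extra attributes are attached to the spans created inside the callback
const response = await Langtrace.withAdditionalAttributes(async () => {
  return await openai.chat.completions.create({
    model: 'gpt-4o-mini',
    messages: [{ role: 'user', content: 'Hello!' }]
  })
}, { user_id: 'user-1234' }) // Illustrative attribute; use keys from LLMSpanAttributes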
Supported integrations
Langtrace automatically captures traces from the following vendors:
Vendor | Type | TypeScript SDK | Python SDK |
---|---|---|---|
OpenAI | LLM | :white_check_mark: | :white_check_mark: |
Anthropic | LLM | :white_check_mark: | :white_check_mark: |
Azure OpenAI | LLM | :white_check_mark: | :white_check_mark: |
Cohere | LLM | :white_check_mark: | :white_check_mark: |
Groq | LLM | :white_check_mark: | :white_check_mark: |
Langchain | Framework | :x: | :white_check_mark: |
LlamaIndex | Framework | :white_check_mark: | :white_check_mark: |
Pinecone | Vector Database | :white_check_mark: | :white_check_mark: |
ChromaDB | Vector Database | :white_check_mark: | :white_check_mark: |
QDrant | Vector Database | :white_check_mark: | :white_check_mark: |
Feature Requests and Issues
- To request features, head over here to start a discussion.
- To raise an issue, head over here and create an issue.
Contributions
We welcome contributions to this project. To get started, fork this repository and start developing. To get involved, join our Discord workspace.
Security
To report security vulnerabilities, email us at security@scale3labs.com. You can read more on security here.
License