OpenAI Monitoring: Monitor OpenAI API Usage with Grafana Cloud

grafana-openai-monitoring is an npm package that provides a way to monitor the Chat Completions and Completions endpoints of the OpenAI API. It sends metrics and logs to Grafana Cloud, allowing you to track and analyze OpenAI API usage and responses.

Installation

You can install grafana-openai-monitoring using npm:

npm install grafana-openai-monitoring

Usage

The following table shows which OpenAI function corresponds to which monitoring function in this library:

OpenAI Function                 Monitoring Function
openai.ChatCompletion.create    chat_v2.monitor
openai.Completion.create        chat_v1.monitor

ChatCompletions

To monitor ChatCompletions using the OpenAI API, you can use the chat_v2.monitor decorator. This decorator automatically tracks API calls and sends metrics and logs to the specified Grafana Cloud endpoints.

Here's how to set it up:

import OpenAI from 'openai';
import { chat_v2 } from 'grafana-openai-monitoring';

const openai = new OpenAI({
  apiKey: 'YOUR_OPENAI_API_KEY',
});

// Patch method
chat_v2.monitor(openai, {
  metrics_url: 'YOUR_PROMETHEUS_METRICS_URL',
  logs_url: 'YOUR_LOKI_LOGS_URL',
  metrics_username: 'YOUR_METRICS_USERNAME',
  logs_username: 'YOUR_LOGS_USERNAME',
  access_token: 'YOUR_ACCESS_TOKEN',
});

// Now any call to openai.chat.completions.create will be automatically tracked
async function main() {
  const completion = await openai.chat.completions.create({
    model: 'gpt-4',
    max_tokens: 100,
    messages: [{ role: 'user', content: 'What is Grafana?' }],
  });
  console.log(completion);
}

main();

Completions

To monitor completions using the OpenAI API, you can use the chat_v1.monitor decorator. This decorator adds monitoring capabilities to the OpenAI API function and sends metrics and logs to the specified Grafana Cloud endpoints.

Here's how to apply it:

import OpenAI from 'openai';
import { chat_v1 } from 'grafana-openai-monitoring';

const openai = new OpenAI({
  apiKey: 'YOUR_OPENAI_API_KEY',
});

// Patch method
chat_v1.monitor(openai, {
  metrics_url: 'YOUR_PROMETHEUS_METRICS_URL',
  logs_url: 'YOUR_LOKI_LOGS_URL',
  metrics_username: 'YOUR_METRICS_USERNAME',
  logs_username: 'YOUR_LOGS_USERNAME',
  access_token: 'YOUR_ACCESS_TOKEN',
});

// Now any call to openai.completions.create will be automatically tracked
async function main() {
  const completion = await openai.completions.create({
    model: 'davinci',
    max_tokens: 100,
    prompt: 'Isn\'t Grafana the best?',
  });
  console.log(completion);
}

main();

Configuration

To use the grafana-openai-monitoring library effectively, you need to provide the following information:

  • YOUR_OPENAI_API_KEY: Replace this with your actual OpenAI API key.
  • YOUR_PROMETHEUS_METRICS_URL: Replace with the URL where you want to send Prometheus metrics.
  • YOUR_LOKI_LOGS_URL: Replace with the URL where you want to send Loki logs.
  • YOUR_METRICS_USERNAME: Replace with the username for Prometheus.
  • YOUR_LOGS_USERNAME: Replace with the username for Loki.
  • YOUR_ACCESS_TOKEN: Replace with the Cloud Access Policy token required for authentication.

After configuring the parameters, the monitored API function will automatically log and track the requests and responses to the specified endpoints.
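
For example, rather than hard-coding credentials, you can read these values from environment variables. The sketch below is a minimal illustration, not part of the library itself; the variable names (OPENAI_API_KEY, GRAFANA_METRICS_URL, and so on) are assumptions, so adjust them to match your own deployment.

import OpenAI from 'openai';
import { chat_v2 } from 'grafana-openai-monitoring';

// Hypothetical environment variable names; use whatever naming your setup prefers.
const openai = new OpenAI({
  apiKey: process.env.OPENAI_API_KEY,
});

// Patch the client exactly as in the examples above, but with values pulled from the environment.
chat_v2.monitor(openai, {
  metrics_url: process.env.GRAFANA_METRICS_URL,
  logs_url: process.env.GRAFANA_LOGS_URL,
  metrics_username: process.env.GRAFANA_METRICS_USERNAME,
  logs_username: process.env.GRAFANA_LOGS_USERNAME,
  access_token: process.env.GRAFANA_ACCESS_TOKEN,
});

This keeps secrets such as the OpenAI API key and the Cloud Access Policy token out of source control.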

Compatibility

Node.js version 16 and above

License

This project is licensed under the GPL-3.0 license - see the LICENSE file for details.
