Firebase Genkit + Azure OpenAI

genkitx-azure-openai is a community plugin for using Azure OpenAI APIs with Firebase Genkit. Built by The Fire Company. 🔥

Installation

Install the plugin in your project with your favorite package manager:

  • npm install genkitx-azure-openai
  • yarn add genkitx-azure-openai
  • pnpm add genkitx-azure-openai

Usage

This plugin exposes the same interface to its models as the OpenAI plugin.

Initialize

You'll also need an Azure OpenAI instance deployed. You can deploy one from the Azure Portal by following this guide.

Once you have your instance running, make sure you have the endpoint and key. You can find them in the Azure Portal, under the "Keys and Endpoint" section of your instance.

You can then define the following environment variables to use the service:

AZURE_OPENAI_API_ENDPOINT=<YOUR_ENDPOINT>
AZURE_OPENAI_API_KEY=<YOUR_KEY>
AZURE_OPENAI_API_EMBEDDING_DEPLOYMENT_NAME=<YOUR_EMBEDDING_DEPLOYMENT_NAME>
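
With these variables set, initializing the plugin without explicit options should be enough. The snippet below is a minimal sketch that assumes the plugin falls back to the environment when no options are passed:

import { configureGenkit } from '@genkit-ai/core';
import { azureOpenAI } from 'genkitx-azure-openai';

export default configureGenkit({
  plugins: [
    // No options passed here: the endpoint, key, and deployment names are
    // expected to be picked up from the environment variables defined above.
    azureOpenAI(),
    // other plugins
  ],
});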

Alternatively, you can pass the values directly to the azureOpenAI constructor:

import { configureGenkit } from '@genkit-ai/core';
import { azureOpenAI } from 'genkitx-azure-openai';

export default configureGenkit({
  plugins: [
    azureOpenAI({
      apiKey: '<your_key>',
      azureOpenAIEndpoint: '<your_endpoint>',
      azureOpenAIApiDeploymentName: '<your_embedding_deployment_name>',
    }),
    // other plugins
  ],
});

If you're using Azure Managed Identity, you can also pass the credentials directly to the constructor:

import { configureGenkit } from '@genkit-ai/core';
import { azureOpenAI } from 'genkitx-azure-openai';
import { DefaultAzureCredential } from '@azure/identity';

const credential = new DefaultAzureCredential();

export default configureGenkit({
  plugins: [
    azureOpenAI({
      credential,
      azureOpenAIEndpoint: '<your_endpoint>',
      azureOpenAIApiDeploymentName: '<your_embedding_deployment_name>',
    }),
    // other plugins
  ],
});

Basic examples

The simplest way to call the text generation model is by using the helper function generate:

import { generate } from '@genkit-ai/ai';
import { gpt35Turbo } from 'genkitx-azure-openai';

// Basic usage of an LLM
const response = await generate({
  model: gpt35Turbo,
  prompt: 'Tell me a joke.',
});

console.log(await response.text());

Using the same interface, you can prompt a multimodal model:

const response = await generate({
  model: gpt4o,
  prompt: [
    { text: 'What animal is in the photo?' },
    { media: { url: imageUrl } },
  ],
  config: {
    // Control the level of visual detail when processing image embeddings.
    // A low detail level also decreases token usage.
    visualDetailLevel: 'low',
  },
});
console.log(await response.text());
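
Because the configuration above includes an embedding deployment, you can also compute text embeddings. The snippet below is a sketch that assumes the plugin exports an embedder such as textEmbeddingAda002 (check the repository for the exact export names):

import { embed } from '@genkit-ai/ai/embedder';
import { textEmbeddingAda002 } from 'genkitx-azure-openai';

// embed returns a numeric vector representing the input text.
const embedding = await embed({
  embedder: textEmbeddingAda002, // assumed export name
  content: 'Hello world',
});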

For more detailed examples and explanations of other functionality, refer to the examples in the plugin's official GitHub repository or the official Genkit documentation.

Contributing

Want to contribute to the project? That's awesome! Head over to our Contribution Guidelines.

Need support?

Note: This repository depends on Google's Firebase Genkit. For issues and questions related to Genkit, please refer to the instructions available in Genkit's repository.

Reach out by opening a discussion on GitHub Discussions.

Credits

This plugin is proudly maintained by the team at The Fire Company. 🔥

License

This project is licensed under the Apache 2.0 License.
