@speakeasy-sdks/mistral v0.11.0

Published 12 months ago

The Mistral TypeScript library provides convenient access to the Mistral REST API from any TypeScript or JavaScript application. The library includes type definitions for all request parameters and response fields, and makes requests using the native Fetch API.

SDK Installation

NPM

npm add @speakeasy-sdks/mistral

PNPM

pnpm add @speakeasy-sdks/mistral

Bun

bun add @speakeasy-sdks/mistral

Yarn

yarn add @speakeasy-sdks/mistral zod

# Note that Yarn does not install peer dependencies automatically. You will need
# to install zod as shown above.

Requirements

For supported JavaScript runtimes, please consult RUNTIMES.md.

SDK Example Usage

Create Chat Completions

This example shows how to create a streamed chat completion.

import { Mistral } from "@speakeasy-sdks/mistral";

const mistral = new Mistral({
    apiKeyAuth: "<YOUR_BEARER_TOKEN_HERE>",
});

async function run() {
    const result = await mistral.chat.stream({
        model: "mistral-small-latest",
        messages: [
            {
                role: "user",
                content: "Who is the best French painter? Answer in JSON.",
            },
        ],
        responseFormat: {
            type: "json_object",
        },
        maxTokens: 512,
        randomSeed: 1337,
    });

    for await (const event of result) {
        // Handle the event
    }
}

run();

Available Resources and Operations

chat

  • stream - Create Chat Completions Stream
  • create - Create Chat Completions

fim

  • create - Create FIM Completions

embeddings

models

  • list - List Available Models

files

fineTuning

Server-sent event streaming

Server-sent events are used to stream content from certain operations. These operations will expose the stream as an async iterable that can be consumed using a for await...of loop. The loop will terminate when the server no longer has any events to send and closes the underlying connection.

import { Mistral } from "@speakeasy-sdks/mistral";

const mistral = new Mistral({
    apiKeyAuth: "<YOUR_BEARER_TOKEN_HERE>",
});

async function run() {
    const result = await mistral.chat.stream({
        model: "mistral-small-latest",
        messages: [
            {
                role: "user",
                content: "Who is the best French painter? Answer in JSON.",
            },
        ],
        responseFormat: {
            type: "json_object",
        },
        maxTokens: 512,
        randomSeed: 1337,
    });

    for await (const event of result) {
        // Handle the event
    }
}

run();
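
Under the hood, the async iterable is driven by the server-sent events wire format: the response body is a stream of events separated by blank lines, each carrying one or more "data:" payload lines. As an illustration of the protocol (not the SDK's internal implementation), a minimal parser that turns a byte stream into an async iterable of payloads might look like this:

```typescript
// Minimal sketch of SSE consumption over a ReadableStream.
// Illustrates the wire format only; the SDK additionally decodes
// payloads into typed event objects.
async function* parseSSE(
  stream: ReadableStream<Uint8Array>
): AsyncGenerator<string> {
  const decoder = new TextDecoder();
  const reader = stream.getReader();
  let buffer = "";
  for (;;) {
    const { done, value } = await reader.read();
    if (done) break;
    buffer += decoder.decode(value, { stream: true });
    let idx: number;
    // Events are separated by a blank line ("\n\n").
    while ((idx = buffer.indexOf("\n\n")) !== -1) {
      const rawEvent = buffer.slice(0, idx);
      buffer = buffer.slice(idx + 2);
      for (const line of rawEvent.split("\n")) {
        if (line.startsWith("data:")) {
          yield line.slice(5).trim();
        }
      }
    }
  }
}
```

Each yielded string is the payload of one "data:" line; the stream ends when the server closes the connection.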

Error Handling

All SDK methods return a response object or throw an error. If Error objects are specified in your OpenAPI Spec, the SDK will throw the appropriate Error type.

Error Object                  Status Code   Content Type
models.BadRequest             400           application/json
models.Unauthorized           401           application/json
models.Forbidden              403           application/json
models.NotFound               404           application/json
models.TooManyRequests        429           application/json
models.InternalServerError    500           application/json
models.ServiceUnavailable     503           application/json
models.SDKError               4xx-5xx       */*

Validation errors can also occur when either method arguments or data returned from the server do not match the expected format. The SDKValidationError that is thrown as a result will capture the raw value that failed validation in an attribute called rawValue. Additionally, a pretty() method is available on this error that can be used to log a nicely formatted string, since validation errors can list many issues and the plain error string may be difficult to read when debugging.

import { Mistral } from "@speakeasy-sdks/mistral";
import * as models from "@speakeasy-sdks/mistral/models";
import { SDKValidationError } from "@speakeasy-sdks/mistral/models";

const mistral = new Mistral({
    apiKeyAuth: "<YOUR_BEARER_TOKEN_HERE>",
});

async function run() {
    let result;
    try {
        result = await mistral.chat.stream({
            model: "mistral-small-latest",
            messages: [
                {
                    role: "user",
                    content: "Who is the best French painter? Answer in JSON.",
                },
            ],
            responseFormat: {
                type: "json_object",
            },
            maxTokens: 512,
            randomSeed: 1337,
        });
    } catch (err) {
        switch (true) {
            case err instanceof SDKValidationError: {
                // Validation errors can be pretty-printed
                console.error(err.pretty());
                // Raw value may also be inspected
                console.error(err.rawValue);
                return;
            }
            case err instanceof models.BadRequest: {
                console.error(err); // handle exception
                return;
            }
            case err instanceof models.Unauthorized: {
                console.error(err); // handle exception
                return;
            }
            case err instanceof models.Forbidden: {
                console.error(err); // handle exception
                return;
            }
            case err instanceof models.NotFound: {
                console.error(err); // handle exception
                return;
            }
            case err instanceof models.TooManyRequests: {
                console.error(err); // handle exception
                return;
            }
            case err instanceof models.InternalServerError: {
                console.error(err); // handle exception
                return;
            }
            case err instanceof models.ServiceUnavailable: {
                console.error(err); // handle exception
                return;
            }
            default: {
                throw err;
            }
        }
    }

    for await (const event of result) {
        // Handle the event
    }
}

run();

Azure AI

Prerequisites

Before you begin, ensure you have AZUREAI_ENDPOINT and an AZURE_API_KEY. To obtain these, you will need to deploy Mistral on Azure AI. See instructions for deploying Mistral on Azure AI here.

Step 1: Install

Install @speakeasy-sdks/mistral-azure using npm:

npm install @speakeasy-sdks/mistral-azure

Step 2: Example Usage

Here's a basic example to get you started. You can also run the example in the examples directory.

import {
  MistralAzure,
  ChatCompletionRole,
} from "@speakeasy-sdks/mistral-azure";

const sdk = new MistralAzure({
  apiKey: "azureAPIKey",
  endpoint: "azureEndpoint",
});

const chatResult = await sdk.chat.create({
  messages: [
    {
      role: ChatCompletionRole.User,
      content: "What is the best French cheese?",
    },
  ],
});

Google Cloud

Prerequisites

Before you begin, you will need to create a Google Cloud project and enable the Mistral API. To do this, follow the instructions here.

To run this locally you will also need to ensure you are authenticated with Google Cloud. You can do this by running

gcloud auth application-default login

Step 1: Install

Install @speakeasy-sdks/mistral-google-cloud using npm:

npm install @speakeasy-sdks/mistral-google-cloud

Step 2: Example Usage

Here's a basic example to get you started. You can also run the example in the examples directory.

import {
  MistralGoogleCloud,
  ChatCompletionRole,
} from "@speakeasy-sdks/mistral-google-cloud";

const sdk = new MistralGoogleCloud();

const chatResult = await sdk.chat.create({
  model: "mistral-small",
  modelVersion: "2402",
  messages: [
    {
      role: ChatCompletionRole.User,
      content: "What is the best French cheese?",
    },
  ],
});

console.log("Success", chatResult);

Server Selection

Select Server by Name

You can override the default server globally by passing a server name to the server optional parameter when initializing the SDK client instance. The selected server will then be used as the default on the operations that use it. This table lists the names associated with the available servers:

Name   Server                      Variables
prod   https://api.mistral.ai/v1   None

import { Mistral } from "@speakeasy-sdks/mistral";

const mistral = new Mistral({
    server: "prod",
    apiKeyAuth: "<YOUR_BEARER_TOKEN_HERE>",
});

async function run() {
    const result = await mistral.chat.stream({
        model: "mistral-small-latest",
        messages: [
            {
                role: "user",
                content: "Who is the best French painter? Answer in JSON.",
            },
        ],
        responseFormat: {
            type: "json_object",
        },
        maxTokens: 512,
        randomSeed: 1337,
    });

    for await (const event of result) {
        // Handle the event
    }
}

run();

Override Server URL Per-Client

The default server can also be overridden per client by passing a URL to the serverURL optional parameter when initializing the SDK client instance. For example:

import { Mistral } from "@speakeasy-sdks/mistral";

const mistral = new Mistral({
    serverURL: "https://api.mistral.ai/v1",
    apiKeyAuth: "<YOUR_BEARER_TOKEN_HERE>",
});

async function run() {
    const result = await mistral.chat.stream({
        model: "mistral-small-latest",
        messages: [
            {
                role: "user",
                content: "Who is the best French painter? Answer in JSON.",
            },
        ],
        responseFormat: {
            type: "json_object",
        },
        maxTokens: 512,
        randomSeed: 1337,
    });

    for await (const event of result) {
        // Handle the event
    }
}

run();

Custom HTTP Client

The TypeScript SDK makes API calls using an HTTPClient that wraps the native Fetch API. This client is a thin wrapper around fetch and provides the ability to attach hooks around the request lifecycle that can be used to modify the request or handle errors and responses.

The HTTPClient constructor takes an optional fetcher argument that can be used to integrate a third-party HTTP client or when writing tests to mock out the HTTP client and feed in fixtures.

The following example shows how to use the "beforeRequest" hook to add a custom header and a timeout to requests, and how to use the "requestError" hook to log errors:

import { Mistral } from "@speakeasy-sdks/mistral";
import { HTTPClient } from "@speakeasy-sdks/mistral/lib/http";

const httpClient = new HTTPClient({
  // fetcher takes a function that has the same signature as native `fetch`.
  fetcher: (request) => {
    return fetch(request);
  }
});

httpClient.addHook("beforeRequest", (request) => {
  const nextRequest = new Request(request, {
    signal: request.signal || AbortSignal.timeout(5000)
  });

  nextRequest.headers.set("x-custom-header", "custom value");

  return nextRequest;
});

httpClient.addHook("requestError", (error, request) => {
  console.group("Request Error");
  console.log("Reason:", `${error}`);
  console.log("Endpoint:", `${request.method} ${request.url}`);
  console.groupEnd();
});

const sdk = new Mistral({ httpClient });
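
Because the fetcher only needs to match the signature of native fetch, tests can substitute a function that returns canned Response objects without touching the network. The following standalone sketch illustrates the idea; it is independent of the SDK, and the fixture contents are purely illustrative:

```typescript
// A mock fetcher with the same signature as native `fetch`.
// It returns a canned JSON Response instead of making a network call,
// which is useful for feeding fixtures into tests.
const mockFetcher = async (
  _request: Request | URL | string
): Promise<Response> => {
  const fixture = { object: "list", data: [{ id: "mistral-small-latest" }] };
  return new Response(JSON.stringify(fixture), {
    status: 200,
    headers: { "content-type": "application/json" },
  });
};
```

A function like this can be passed as the fetcher argument when constructing the HTTPClient, so every SDK request resolves against fixtures.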

Authentication

Per-Client Security Schemes

This SDK supports the following security scheme globally:

Name         Type   Scheme
apiKeyAuth   http   HTTP Bearer

To authenticate with the API, the apiKeyAuth parameter must be set when initializing the SDK client instance. For example:

import { Mistral } from "@speakeasy-sdks/mistral";

const mistral = new Mistral({
    apiKeyAuth: "<YOUR_BEARER_TOKEN_HERE>",
});

async function run() {
    const result = await mistral.chat.stream({
        model: "mistral-small-latest",
        messages: [
            {
                role: "user",
                content: "Who is the best French painter? Answer in JSON.",
            },
        ],
        responseFormat: {
            type: "json_object",
        },
        maxTokens: 512,
        randomSeed: 1337,
    });

    for await (const event of result) {
        // Handle the event
    }
}

run();

File uploads

Certain SDK methods accept files as part of a multi-part request. It is possible and typically recommended to upload files as a stream rather than reading the entire contents into memory. This avoids excessive memory consumption and potentially crashing with out-of-memory errors when working with very large files. The following example demonstrates how to attach a file stream to a request.

TIP

Depending on your JavaScript runtime, there are convenient utilities that return a handle to a file without reading the entire contents into memory:

  • Node.js v20+: Since v20, Node.js comes with a native openAsBlob function in node:fs.
  • Bun: The native Bun.file function produces a file handle that can be used for streaming file uploads.
  • Browsers: All supported browsers return an instance of File when reading the value from an <input type="file"> element.
  • Node.js v18: A file stream can be created using the fileFrom helper from fetch-blob/from.js.

import { Mistral } from "@speakeasy-sdks/mistral";
import { openAsBlob } from "node:fs";

const mistral = new Mistral({
    apiKeyAuth: "<YOUR_BEARER_TOKEN_HERE>",
});

async function run() {
    const result = await mistral.files.upload({
        file: await openAsBlob("./sample-file"),
    });

    // Handle the result
    console.log(result);
}

run();

Retries

Some of the endpoints in this SDK support retries. If you use the SDK without any configuration, it will fall back to the default retry strategy provided by the API. However, the default retry strategy can be overridden on a per-operation basis, or across the entire SDK.

To change the default retry strategy for a single API call, simply provide a retryConfig object to the call:

import { Mistral } from "@speakeasy-sdks/mistral";

const mistral = new Mistral({
    apiKeyAuth: "<YOUR_BEARER_TOKEN_HERE>",
});

async function run() {
    const result = await mistral.chat.stream(
        {
            model: "mistral-small-latest",
            messages: [
                {
                    role: "user",
                    content: "Who is the best French painter? Answer in JSON.",
                },
            ],
            responseFormat: {
                type: "json_object",
            },
            maxTokens: 512,
            randomSeed: 1337,
        },
        {
            retries: {
                strategy: "backoff",
                backoff: {
                    initialInterval: 1,
                    maxInterval: 50,
                    exponent: 1.1,
                    maxElapsedTime: 100,
                },
                retryConnectionErrors: false,
            },
        }
    );

    for await (const event of result) {
        // Handle the event
    }
}

run();

If you'd like to override the default retry strategy for all operations that support retries, you can provide a retryConfig at SDK initialization:

import { Mistral } from "@speakeasy-sdks/mistral";

const mistral = new Mistral({
    retryConfig: {
        strategy: "backoff",
        backoff: {
            initialInterval: 1,
            maxInterval: 50,
            exponent: 1.1,
            maxElapsedTime: 100,
        },
        retryConnectionErrors: false,
    },
    apiKeyAuth: "<YOUR_BEARER_TOKEN_HERE>",
});

async function run() {
    const result = await mistral.chat.stream({
        model: "mistral-small-latest",
        messages: [
            {
                role: "user",
                content: "Who is the best French painter? Answer in JSON.",
            },
        ],
        responseFormat: {
            type: "json_object",
        },
        maxTokens: 512,
        randomSeed: 1337,
    });

    for await (const event of result) {
        // Handle the event
    }
}

run();
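
Under the backoff strategy, the delay before each retry grows geometrically from initialInterval (in milliseconds) by exponent, is capped at maxInterval, and retrying stops once maxElapsedTime is exceeded. The following sketch shows the interval schedule implied by such a configuration; the exact formula is my reading of the fields for illustration, and real implementations typically add jitter as well:

```typescript
// Compute the (un-jittered) retry delays implied by a backoff config.
// Field names mirror the retryConfig shown above.
interface BackoffConfig {
  initialInterval: number; // delay before the first retry, ms
  maxInterval: number;     // cap on any single delay, ms
  exponent: number;        // growth factor between retries
  maxElapsedTime: number;  // stop retrying past this total, ms
}

function backoffSchedule(cfg: BackoffConfig): number[] {
  const delays: number[] = [];
  let elapsed = 0;
  for (let attempt = 0; ; attempt++) {
    const delay = Math.min(
      cfg.initialInterval * Math.pow(cfg.exponent, attempt),
      cfg.maxInterval
    );
    if (elapsed + delay > cfg.maxElapsedTime) break;
    delays.push(delay);
    elapsed += delay;
  }
  return delays;
}
```

With the values from the example (initialInterval: 1, maxInterval: 50, exponent: 1.1, maxElapsedTime: 100), the delays grow slowly from 1 ms toward the 50 ms cap until the 100 ms budget is spent.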

Development

Maturity

This SDK is in beta, and there may be breaking changes between versions without a major version update. Therefore, we recommend pinning usage to a specific package version. This way, you can install the same version each time without breaking changes unless you are intentionally looking for the latest version.
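
For example, pinning can be done at install time so that package.json records an exact version rather than a ^ range:

```shell
# Record an exact version so future installs are reproducible
npm install --save-exact @speakeasy-sdks/mistral@0.11.0
```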

Contributions

While we value open-source contributions to this SDK, this library is generated programmatically. Feel free to open a PR or a GitHub issue as a proof of concept, and we'll do our best to include it in a future release!

SDK Created by Speakeasy
