# qwen-ai-provider
Qwen AI Provider for interacting with Qwen models via Alibaba Cloud's Model Studio API.
## Requirements
This provider requires a valid Qwen API key and adherence to the API specifications provided by Alibaba Cloud.
## Installation

The `qwen-ai-provider` is available as an npm package. You can install it with:

```bash
npm i qwen-ai-provider
```
## Provider Instance

You can import the default provider instance `qwen` from `qwen-ai-provider`:

```ts
import { qwen } from "qwen-ai-provider"
```
If you need a customized setup, you can import `createQwen` from `qwen-ai-provider` and create a provider instance with your settings:

```ts
import { createQwen } from "qwen-ai-provider"

const qwenProvider = createQwen({
  // custom settings, e.g.:
  apiKey: "YOUR_QWEN_API_KEY",
  baseURL: "https://dashscope-intl.aliyuncs.com/compatible-mode/v1",
  headers: { "Custom-Header": "value" },
})
```
The following optional settings can be provided:
- **baseURL** *string*

  Use a different URL prefix for API calls, e.g. to use proxy servers. The default is `https://dashscope-intl.aliyuncs.com/compatible-mode/v1`.

- **apiKey** *string*

  Qwen API key. Defaults to the `DASHSCOPE_API_KEY` environment variable.

- **headers** *Record<string,string>*

  Custom headers to include in requests.

- **queryParams** *Record<string,string>*

  Optional URL query parameters for API calls.

- **fetch** *FetchFunction*

  Custom fetch implementation, useful for testing or middleware.
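For example, the `fetch` option can wrap the global fetch, e.g. to log outgoing requests. This is a minimal sketch; the logging wrapper is illustrative and not part of the package:

```ts
import { createQwen } from "qwen-ai-provider"

// Illustrative logging wrapper: prints each request target, then forwards
// the call to the global fetch unchanged.
const loggingFetch: typeof fetch = async (input, init) => {
  console.log("Qwen API request:", input)
  return fetch(input, init)
}

const qwenWithLogging = createQwen({
  fetch: loggingFetch,
})
```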
## Models

The provider supports text generation/chat and embeddings. The first argument is the model id. For example:

```ts
const chatModel = qwen("qwen-plus")
```
Other model functions include:

- `qwen.chatModel(modelId, settings)`
- `qwen.completion(modelId, settings)`
- `qwen.textEmbeddingModel(modelId, settings)`
- `qwen.languageModel(modelId, settings)` (alias for `chatModel`)
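For instance, each factory can be called directly. The embedding model id below (`text-embedding-v3`) is an assumption used for illustration; substitute whichever Qwen model ids your account has access to:

```ts
import { qwen } from "qwen-ai-provider"

// Chat and completion models (the second settings argument is optional).
const chat = qwen.chatModel("qwen-plus")
const completion = qwen.completion("qwen-plus")

// Embedding model; "text-embedding-v3" is an assumed id for illustration.
const embeddings = qwen.textEmbeddingModel("text-embedding-v3")

// languageModel is an alias for chatModel.
const sameAsChat = qwen.languageModel("qwen-plus")
```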
## Tested Models and Capabilities
This provider has been tested with Qwen models and supports a range of features:
| Feature              | Support                           |
| -------------------- | --------------------------------- |
| Text generation      | :white_check_mark:                |
| Streaming output     | :white_check_mark: (if supported) |
| Object generation    | :white_check_mark:                |
| Embedding generation | :white_check_mark:                |
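As a sketch of text generation and streaming, assuming the provider is used with the Vercel AI SDK's `generateText` and `streamText` helpers from the `ai` package (that pairing is an assumption of this example, not a requirement stated above):

```ts
import { generateText, streamText } from "ai"
import { qwen } from "qwen-ai-provider"

// Text generation: one complete response.
const { text } = await generateText({
  model: qwen("qwen-plus"),
  prompt: "Write a two-sentence introduction to the Qwen model family.",
})
console.log(text)

// Streaming output: consume the response incrementally.
const stream = streamText({
  model: qwen("qwen-plus"),
  prompt: "Explain streaming responses in one paragraph.",
})
for await (const delta of stream.textStream) {
  process.stdout.write(delta)
}
```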
## Usage Details

### Chat Settings
The settings object allows you to fine-tune model behavior, including streaming, temperature control, and tool integration. See inline documentation in the source code for more details.
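As a hypothetical sketch: model-level settings go in the second argument (the exact keys live in the source-code documentation), while sampling controls such as temperature are typically passed per call, shown here with the AI SDK's `generateText` (an assumed pairing):

```ts
import { generateText } from "ai"
import { qwen } from "qwen-ai-provider"

// Hypothetical placeholder: the concrete setting keys are documented
// inline in the source code.
const tunedModel = qwen.chatModel("qwen-plus", {
  // ...model-level settings
})

// Per-call temperature control via the AI SDK (assumed usage).
const { text } = await generateText({
  model: tunedModel,
  temperature: 0.2,
  prompt: "Summarize the Qwen model family in two sentences.",
})
```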
### Tool Integration
Some Qwen models support tool usage for function calls. Configure the `tools` property in your settings as needed.
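A minimal sketch of tool calling, assuming the AI SDK 4-style `generateText` and `tool` helpers from the `ai` package together with `zod` for the parameter schema; the weather tool itself is hypothetical:

```ts
import { generateText, tool } from "ai"
import { z } from "zod"
import { qwen } from "qwen-ai-provider"

const { text, toolCalls } = await generateText({
  model: qwen("qwen-plus"),
  tools: {
    // Hypothetical tool used only for illustration.
    getWeather: tool({
      description: "Get the current weather for a city",
      parameters: z.object({ city: z.string() }),
      execute: async ({ city }) => ({ city, forecast: "sunny" }),
    }),
  },
  prompt: "What is the weather like in Hangzhou?",
})
```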
## Versioning and Packaging

When publishing packages (e.g. on GitHub Packages), the provider supports pre-release versioning and packaging via GitHub Actions.
## Testing

This package includes a suite of tests using Vitest. To run tests:

```bash
npm run test
```
For more details, review the test files in the repository.
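If you want to unit-test your own code that builds a provider without any network traffic, the `fetch` setting is one place to inject a stub. A hypothetical sketch, not taken from this package's test suite:

```ts
import { describe, expect, it, vi } from "vitest"
import { createQwen } from "qwen-ai-provider"

describe("createQwen", () => {
  it("constructs a model without calling the network", () => {
    // Stub fetch that would return an empty JSON body if it were ever invoked.
    const stubFetch = vi.fn(async () => new Response("{}", { status: 200 }))

    const provider = createQwen({
      apiKey: "test-key",
      fetch: stubFetch,
    })

    // Creating a model only configures it; no request is made yet.
    const model = provider("qwen-plus")
    expect(model).toBeDefined()
    expect(stubFetch).not.toHaveBeenCalled()
  })
})
```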