@theia/ai-llamafile v1.59.0
AI Llamafile Integration
The AI Llamafile package provides an integration that allows users to manage and interact with Llamafile language models within Theia IDE.
Features
- Start and stop Llamafile language servers.
Commands
Start Llamafile
- Command ID: llamafile.start
- Label: Start Llamafile
- Functionality: Allows you to start a Llamafile language server by selecting from a list of configured Llamafiles.
Stop Llamafile
- Command ID: llamafile.stop
- Label: Stop Llamafile
- Functionality: Allows you to stop a running Llamafile language server by selecting from a list of currently running Llamafiles.
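Because these commands are registered under stable IDs, other Theia extensions can also trigger them programmatically through Theia's standard CommandService. The following is a minimal sketch of that idea; only the command IDs are taken from this package, while the MyLlamafileLauncher class and its wiring are illustrative assumptions.

```ts
// Minimal sketch: invoking the Llamafile commands from another Theia extension.
// Only the command IDs ('llamafile.start', 'llamafile.stop') come from this
// README; the surrounding class and injection setup are illustrative.
import { injectable, inject } from '@theia/core/shared/inversify';
import { CommandService } from '@theia/core/lib/common/command';

@injectable()
export class MyLlamafileLauncher {

    @inject(CommandService)
    protected readonly commandService: CommandService;

    // Opens the same quick pick as the "Start Llamafile" palette entry.
    async startLlamafile(): Promise<void> {
        await this.commandService.executeCommand('llamafile.start');
    }

    // Opens the quick pick listing currently running Llamafiles.
    async stopLlamafile(): Promise<void> {
        await this.commandService.executeCommand('llamafile.stop');
    }
}
```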
Usage
Starting a Llamafile Language Server:
- Use the command palette to invoke Start Llamafile.
- A quick pick menu will appear with a list of configured Llamafiles.
- Select a Llamafile to start its language server.
Stopping a Llamafile Language Server:
- Use the command palette to invoke Stop Llamafile.
- A quick pick menu will display a list of currently running Llamafiles.
- Select a Llamafile to stop its language server.
Dependencies
This extension depends on the @theia/ai-core package for AI-related services and functionalities.
Configuration
Make sure to configure your Llamafiles in the preference settings under ai-features.llamafile.llamafiles. This setting is an array of objects, where each object defines a Llamafile with a user-friendly name, the file URI, and the port on which to start the server.
Example Configuration:
```json
{
  "ai-features.llamafile.llamafiles": [
    {
      "name": "MyLlamaFile",
      "uri": "file:///path/to/my.llamafile",
      "port": 30000
    }
  ]
}
```
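If another extension needs access to the same configuration, it can read it through Theia's standard PreferenceService. The sketch below is illustrative only: the preference key and entry fields mirror the example above, while the LlamafileEntry interface and the reader class are assumptions rather than part of this package's API.

```ts
// Minimal sketch: reading the Llamafile configuration via Theia's
// PreferenceService. The preference key and entry shape mirror the example
// above; LlamafileEntry is a local assumption, not an export of this package.
import { injectable, inject } from '@theia/core/shared/inversify';
import { PreferenceService } from '@theia/core/lib/browser/preferences/preference-service';

interface LlamafileEntry {
    name: string;
    uri: string;
    port: number;
}

@injectable()
export class LlamafileConfigReader {

    @inject(PreferenceService)
    protected readonly preferenceService: PreferenceService;

    // Returns the configured Llamafiles, or an empty list if none are set.
    getConfiguredLlamafiles(): LlamafileEntry[] {
        return this.preferenceService.get<LlamafileEntry[]>(
            'ai-features.llamafile.llamafiles', []);
    }
}
```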