0.0.3 • Published 1 year ago
spellbook-forge v0.0.3
Spellbook Forge
Make your LLM prompts executable and version controlled.
Quick Start
In your Express server:
yarn add spellbook-forge

import express from "express";
import { spellbookForge } from "spellbook-forge";

const app = express()
  .use(spellbookForge({
    gitHost: 'https://github.com'
  }));

app.listen(3000);
and then:
http://localhost:3000/your/repository/prompt?execute
<-- HTTP 200
{
"prompt-content": "Complete this phrase in coders' language: Hello …",
"model": "gpt3.5",
"result": "Hello, World!"
}
What is this?
This is an Express middleware that lets you expose an API interface for your LLM prompts. It automatically generates a server for prompts stored in a git repository. Using Spellbook Forge, you can:
- Store & manage LLM prompts in a familiar tool: a git repository
- Execute prompts with chosen model and get results using a simple API
- Perform basic CRUD operations on prompts
Note: This is an early version. Expect bugs, breaking changes, and poor performance.
Documentation
Full documentation coming soon!
Prompt format
Prompts must adhere to a specific format (JSON/YAML). See more info here.
Example
├── prompt1
│   ├── prompt.json
│   └── readme.md
└── collection
    └── prompt2
        ├── prompt.yaml
        └── readme.md
The above file structure will result in the following API endpoints being generated:
{host}/prompt1
{host}/collection/prompt2
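Conceptually, the endpoint for each prompt is just its directory path inside the repository, with the prompt file name dropped. A minimal sketch of that mapping (illustrative only, not the library's actual internals):

```javascript
// Map a prompt file path inside the repo to its API endpoint path.
// Keeps the directory path, drops the prompt.json / prompt.yaml file name.
function endpointFor(filePath) {
  const parts = filePath.split("/");
  parts.pop(); // remove the prompt file itself
  return "/" + parts.join("/");
}

console.log(endpointFor("prompt1/prompt.json"));            // "/prompt1"
console.log(endpointFor("collection/prompt2/prompt.yaml")); // "/collection/prompt2"
```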
Files
prompt.json
The main file with the prompt content and configuration.

readme.md
Additional information about prompt usage, examples, etc.
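For illustration, a minimal prompt.json might look like the following. The field names are inferred from the Quick Start response above; the actual schema may differ, so treat this as an assumption:

```json
{
  "prompt-content": "Complete this phrase in coders' language: Hello …",
  "model": "gpt3.5"
}
```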