1.0.0 • Published 3 months ago
perplexity-cli
A command-line interface (CLI) app that lets you run queries against Perplexity's API directly from your terminal.
Installation
Local Installation
# Clone the repository
git clone https://github.com/yourusername/perplexity-cli.git
cd perplexity-cli
# Install dependencies
npm install
# Link the CLI globally
npm link
Global Installation (once published)
npm install -g perplexity-cli
Or run directly without installation:
npx perplexity-cli
Setup
Before using the CLI, you need to set your Perplexity API key:
perplexity-cli set-key YOUR_API_KEY_HERE
You can get your API key from the Perplexity API Settings page.
Usage
Running Queries
perplexity-cli query "What is the capital of France?"
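For readers curious what a query turns into on the wire: Perplexity exposes an OpenAI-compatible chat completions endpoint, so the request the CLI sends likely looks roughly like the sketch below. The function name and exact body fields are assumptions for illustration, not this package's actual code:

```javascript
// Hypothetical sketch of the HTTP request behind `perplexity-cli query`.
// Endpoint and payload follow Perplexity's OpenAI-compatible chat API.
function buildQueryRequest(apiKey, question, model = 'sonar') {
  return {
    url: 'https://api.perplexity.ai/chat/completions',
    options: {
      method: 'POST',
      headers: {
        Authorization: `Bearer ${apiKey}`,
        'Content-Type': 'application/json',
      },
      body: JSON.stringify({
        model,
        messages: [{ role: 'user', content: question }],
      }),
    },
  };
}

// Usage with the built-in fetch (Node 18+):
// const { url, options } = buildQueryRequest(key, 'What is the capital of France?');
// const res = await fetch(url, options);
// const data = await res.json();
// console.log(data.choices[0].message.content);
```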
Stream the Response in Real-Time
perplexity-cli query "Explain quantum computing" --stream
Save the Response to a File
perplexity-cli query "Write a short story about AI" --output story.txt
Specify a Different Model
perplexity-cli query "Explain quantum computing" --model sonar-large
View Query History
perplexity-cli history
View Available Models
perplexity-cli models
View Your API Key (Masked)
perplexity-cli view-key
Clear Your API Key
perplexity-cli clear-key
Available Commands
set-key <key>: Set your Perplexity API key
view-key: View your currently set API key (masked)
query <question>: Send a query to the Perplexity API
models: List available Perplexity API models
history: View history of recent queries
clear-key: Clear your stored API key
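The view-key command shows the key masked rather than in full. The exact masking format is the tool's own; as a minimal sketch of the idea (the first/last-four-characters scheme here is an assumption):

```javascript
// Hypothetical sketch of API-key masking for a view-key style command.
// Keeps the first and last 4 characters visible; the scheme is illustrative.
function maskKey(key) {
  if (!key || key.length <= 8) return '****';
  return key.slice(0, 4) + '...' + key.slice(-4);
}
```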
Options for Query Command
-m, --model <model>: Specify the model to use for your query (default: sonar)
-s, --stream: Stream the response as it is generated
-o, --output <file>: Save the response to a file
-h, --help: Display help information
-V, --version: Display version information
Models
The CLI supports various Perplexity models:
sonar: Default; balanced speed and capability
sonar-small: Fastest, least capable
sonar-medium: Good balance of speed and capability
sonar-large: Most capable, slower
codellama-70b: Specialized for code generation
mistral-7b: Open-source model
mixtral-8x7b: Mixture-of-experts model
llama-3-70b: Meta's latest model
License
MIT