# Codex Batch CLI

`@kakugo-ch/codex-batch-cli` v1.1.1
A robust CLI tool for batch processing questions using OpenAI's CLI tools. Process multiple questions at once, handle failures gracefully, and generate comprehensive reports.
## Features
- Batch Processing: Handle multiple questions in a single run
- Smart Retry: Automatically retry failed queries with the `--retry` flag
- Continuous Processing: Automatically continues from where it left off if interrupted
- Parallel Processing: Process multiple questions concurrently with configurable concurrency
- Detailed Reports: Generate structured JSON reports with answers and errors
- Template Support: Define reusable templates with variables for common question patterns
- Error Handling: Robust error handling with configurable timeouts and retries
- Automatic Security: Sensitive data like API keys and tokens are automatically masked in logs and error messages
- Easy to Use: Simple CLI interface with clear commands
- Flexible CLI Support: Works with different OpenAI CLI tools (codex, open-codex, etc.)
## Quick Start
1. Install your preferred OpenAI CLI tool:

   ```bash
   # For example, install OpenAI's Codex CLI:
   npm install -g @openai/codex

   # Or install open-codex:
   npm install -g @openai/open-codex
   ```

2. Install this CLI:

   ```bash
   npm install -g @kakugo-ch/codex-batch-cli
   ```

3. Set your OpenAI API key:

   ```bash
   export OPENAI_API_KEY="your_api_key_here"
   ```

4. Create a `questions.json` file. You can use direct questions or templates:

   ````json
   {
     "templates": [
       {
         "id": "code-review",
         "content": "Please review the following {{ language }} code:\n\n```{{ language }}\n{{ code }}\n```\n\nFocus on: {{ focus }}"
       }
     ],
     "questions": [
       {
         "id": "q1",
         "template": "code-review",
         "variables": {
           "language": "javascript",
           "code": "function add(a,b) { return a+b }",
           "focus": "code style and type safety"
         }
       },
       {
         "id": "q2",
         "text": "What is the language of this repository?"
       }
     ]
   }
   ````

5. Run the CLI:

   ```bash
   codex-batch -i questions.json -o report.json
   ```
## Prerequisites
- Node.js >= 14.0.0
- OpenAI API key (set as the `OPENAI_API_KEY` environment variable)
- An OpenAI CLI tool (`npm install -g @openai/codex` or `npm install -g @openai/open-codex`)
## Commands
### Process New Questions
```bash
codex-batch -i questions.json -o report.json
```

### Retry Failed Questions

```bash
codex-batch --retry report.json
```

### Options

- `-i, --input <file>` - Input questions file
- `-o, --output <file>` - Output report file (default: `report.json`)
- `-r, --repo <path>` - Path to the repository to analyze (default: current directory)
- `--retry <file>` - Retry failed questions from a report
- `--max-retries <number>` - Maximum retries for failed queries (default: 3)
- `--timeout <number>` - Timeout in milliseconds (default: 600000)
- `--concurrency <number>` - Number of questions to process in parallel (default: 1)
- `--cli-args <args>` - Additional arguments to pass to the CLI (e.g. `"--temperature 0.7"`)
- `--executable-name <name>` - Name of the CLI executable to use (default: `"codex"`)
- `-h, --help` - Show help
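Since `--cli-args` takes a single quoted string, it has to be split into individual arguments before being handed to the underlying CLI. A minimal sketch of that splitting, assuming simple whitespace-separated flags (the function name `splitCliArgs` is illustrative, not part of the tool's API):

```javascript
// Hypothetical sketch: split a "--cli-args" string into individual
// arguments. The actual codex-batch implementation may differ.
function splitCliArgs(cliArgs) {
  // Split on runs of whitespace and drop any empty entries.
  return cliArgs.trim().split(/\s+/).filter(Boolean);
}

console.log(splitCliArgs("--temperature 0.7 --max-tokens 100"));
// [ '--temperature', '0.7', '--max-tokens', '100' ]
```

Note that naive whitespace splitting would break flag values that themselves contain spaces, so keep each passed-through flag and its value free of embedded whitespace.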
## Input Format
The input file can contain both templates and direct questions:
### Templates
Templates use Liquid syntax and can be defined in the `templates` array:
```json
{
  "templates": [
    {
      "id": "template-id",
      "content": "Template content with {{ variable }} placeholders"
    }
  ]
}
```
### Questions

Questions can either use a template or provide direct text:
```json
{
  "questions": [
    {
      "id": "question-1",
      "template": "template-id",
      "variables": {
        "variable": "value"
      }
    },
    {
      "id": "question-2",
      "text": "Direct question without template"
    }
  ]
}
```
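In other words, every question entry needs an `id` plus either a `template` reference or a direct `text` field. A small sketch of that rule (the function name `isValidQuestion` is illustrative, not part of the tool):

```javascript
// Illustrative validity check for a question entry -- not the tool's code.
// A question needs a non-empty "id" and either "template" or "text".
function isValidQuestion(q) {
  const hasId = typeof q.id === "string" && q.id.length > 0;
  const hasTemplate = typeof q.template === "string";
  const hasText = typeof q.text === "string";
  return hasId && (hasTemplate || hasText);
}

console.log(isValidQuestion({ id: "question-2", text: "Direct question without template" })); // true
console.log(isValidQuestion({ id: "question-3" })); // false
```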
## Examples

### Basic Usage

```bash
# Using default codex CLI sequentially
codex-batch -i questions.json -o report.json

# Using codex CLI with parallel processing (10 questions at a time)
codex-batch -i questions.json -o report.json --concurrency 10

# High concurrency for large batches (50 questions at a time)
codex-batch -i questions.json -o report.json --concurrency 50

# Using open-codex CLI
codex-batch -i questions.json -o report.json --executable-name open-codex
```

If the command is interrupted (e.g., by Ctrl+C), running it again with the same output file will:

1. Skip questions that were already successfully processed
2. Retry questions that failed
3. Process remaining questions that weren't attempted yet
### Analyze Different Repository
```bash
codex-batch -i questions.json -r /path/to/repo
```

### With Custom CLI Parameters
```bash
# Using codex with custom parameters
codex-batch -i questions.json --cli-args "--temperature 0.7 --max-tokens 100"

# Using open-codex with custom parameters
codex-batch -i questions.json --executable-name open-codex --cli-args "--temperature 0.7 --max-tokens 100"
```

### Retry Failed Questions with Different Parameters
```bash
codex-batch --retry report.json --cli-args "--temperature 0.9"
```

## Report Format
```json
{
  "timestamp": "2025-04-27T19:00:00.000Z",
  "totalQuestions": 2,
  "successfulQueries": 1,
  "failedQueries": 1,
  "answers": [
    {
      "questionId": "q1",
      "question": "What is the language of this repository?",
      "answer": "This repository is written in TypeScript.",
      "startTime": "2025-04-27T18:59:58.123Z",
      "endTime": "2025-04-27T19:00:00.000Z",
      "durationMs": 1877
    },
    {
      "questionId": "q2",
      "question": "Is this repository using SQL databases?",
      "error": "Timeout error",
      "startTime": "2025-04-27T19:00:00.001Z",
      "endTime": "2025-04-27T19:00:30.001Z",
      "durationMs": 30000
    }
  ],
  "startTime": "2025-04-27T18:59:58.123Z",
  "endTime": "2025-04-27T19:00:30.001Z",
  "durationMs": 31878,
  "fastestAnswerMs": 1877,
  "slowestAnswerMs": 30000,
  "averageAnswerMs": 15938
}
```
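The timing summary fields can be derived directly from the per-answer `durationMs` values. A sketch, assuming the average is rounded down (which matches the sample report above; the tool's actual rounding may differ):

```javascript
// Illustrative derivation of the report's timing summary fields -- not
// necessarily the tool's actual code.
function summarizeDurations(answers) {
  const durations = answers.map((a) => a.durationMs);
  return {
    fastestAnswerMs: Math.min(...durations),
    slowestAnswerMs: Math.max(...durations),
    averageAnswerMs: Math.floor(
      durations.reduce((sum, d) => sum + d, 0) / durations.length
    ),
  };
}

console.log(summarizeDurations([{ durationMs: 1877 }, { durationMs: 30000 }]));
// { fastestAnswerMs: 1877, slowestAnswerMs: 30000, averageAnswerMs: 15938 }
```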
## Development

After making changes:
```bash
npm run build && npm test && npm link
```

## Release
```bash
npm run build && npm test && npm pack && npm publish
```