
codespin-cli

CodeSpin.AI Code Generation Tools. Open Source and MIT-licensed.

Installation

First, install Node.js. Visit https://nodejs.org/en. You need Node 18 or above.

Then, install codespin using:

npm install -g codespin

Getting Help

To list all available commands:

codespin help

For specific help on a command, type codespin [command] help.

For instance:

codespin gen help # or codespin generate help

Also, check the Discord Channel.

Usage

Set the OPENAI_API_KEY (or ANTHROPIC_API_KEY for Anthropic) environment variable. If you don't have an OpenAI account, register at https://platform.openai.com/signup.
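For example, on Linux or macOS (replace the placeholders with your actual keys):

export OPENAI_API_KEY="sk-..."
# or, if you're using Anthropic models:
export ANTHROPIC_API_KEY="..."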

If you don't want to get an API key, you can also use codespin with ChatGPT (see the Using with ChatGPT section below).

Ready to try? The following command generates code for a Hello World app and displays it:

codespin gen --prompt 'Make a python program (in main.py) that prints Hello, World!'

To save the generated code to a file, use the --write (or -w) option:

codespin gen --prompt 'Make a python program (in main.py) that prints Hello, World!' --write

Simple, right? That's just the beginning.

codespin init

Most features of codespin are available only once you initialize your project like this:

codespin init

This command creates a .codespin directory containing some default templates and configuration files. You may edit these templates as required, but the default templates are fairly good.

It is also recommended to do a global init, which stores config files under $HOME/.codespin. The global init also creates openai.json and anthropic.json under $HOME/.codespin, where you can save your API keys.

codespin init --global

💡: It is recommended to store the API keys globally, to avoid accidentally committing them to git.
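The exact schema of these config files may change between versions, but as a rough sketch (this is an assumption; check the file that codespin init --global actually creates), $HOME/.codespin/openai.json might look like:

{
  "apiKey": "sk-..."
}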

codespin generate

Use the codespin generate command to produce source code. You may also use the short alias codespin gen. The following examples will use codespin gen.

Generating a single code file

First, create a "prompt file" to describe the source code. A prompt file is simply a markdown file containing instructions on how to generate the code.

Let's start with something simple.

Create a file called main.py.md as follows:

---
out: main.py
include:
  - main.py
---

Print Hello, World!
Include a shebang to make it directly executable.

Then generate the code by calling codespin:

codespin gen main.py.md --write

Alternatively, you could specify the file paths on the CLI with --out (or the -o shorthand) and --include (or -i) instead of using front-matter.

codespin gen main.py.md --out main.py --include main.py --write

Generating multiple files

This is just as simple. Here's an example of how you'd scaffold a new Node.js blog app:

blogapp.md:

Create a Node.JS application for a blog.
Split the code into multiple files for maintainability.
Use ExpressJS. Use Postgres for the database.
Place database code in a different file (a database layer).
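Then generate it the same way as before, for example:

codespin gen blogapp.md --write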

Include External Files and Declarations

For the code generator to better understand the context, you must pass the relevant external files (such as dependencies) with the --include (or -i) option.

For example, if main.py depends on dep1.py and dep2.py:

codespin gen main.py.md --out main.py --include main.py --include dep1.py --include dep2.py --write

But in some cases, including entire files (with --include or -i) will result in larger context sizes. To reduce the size of the context, you can send just the declarations/signatures found in a file with the --declare (or -d) option.

codespin gen main.py.md --out main.py --include main.py -d dep1.py -d dep2.py --write

Note that creating declarations will require a call to the LLM. Declarations are then cached until the file changes.
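For illustration (the exact form depends on the LLM), the cached declaration for a dep1.py that implements calculate() would contain roughly just its signature, something like:

def calculate(a, b): ...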

With both --include and --declare, you can specify wildcards. The following will include all ".py" files:

codespin gen main.py.md --out main.py -d "*.py" --write

You can also define the --include, --declare, --template, --parser, --model, and --max-tokens parameters in front-matter like this:

---
model: openai:gpt-3.5-turbo-16k
maxTokens: 8000
out: main.py
include:
  - dep1.py
  - dep2.py
---

Generate a Python CLI script named index.py that accepts arguments, calls calculate() function in dep1.py and prints their sum with print() in dep2.py.

In-place Includes in Prompt Files

It's quite common to need a standard set of rules in all prompt files, such as a project's coding conventions. The include directive (codespin:include:<path>) lets you write common rules in a file and include them in prompts as needed.

For example, if you had a ./conventions.txt file:

- Use snake_case for variables
- Generate extensive comments

You can include it like this:

Generate a Python CLI script named index.py that accepts arguments and prints their sum.

codespin:include:/conventions.txt

Spec Files

Spec files are another way to handle coding conventions and other instructions.

A "spec" is a template file containing a placeholder "{prompt}". The placeholder will be replaced by the prompt supplied (via the prompt file, or the --prompt argument).

For example, if you have the following spec called ./myrules.txt:

{prompt}

Rules:
- Use snake_case for variables
- Generate extensive comments

You can include it like this:

codespin gen main.py.md --spec myrules.txt

Executing code in Prompt Files

The exec directive executes a command and replaces the line with the output of the command. This powerful technique can be used to make your prompts smarter.

For example, if you want to include the diff of a file in your prompt, you could do this:

codespin:exec:git diff HEAD~1 HEAD -- main.py

Regenerating code

The easiest way to regenerate code (for a single file) is by changing the original prompt to mention just the required modifications.

For example, if you originally had this in calculate_area.py.md:

Write a function named calculate_area(l, b) which returns l*b.

You could rewrite it as:

Change the function calculate_area to take an additional parameter shape_type (as the first param), and return the correct calculations. The subsequent parameters are dimensions of the shape, and there could be one (for a circle) or more dimensions (for a multi-sided shape).

And run the gen command as usual:

codespin gen calculate_area.py.md --out calculate_area.py --include calculate_area.py -w

Sometimes you want to ignore the latest modifications while generating code, and use previously committed file contents. The include parameter (both as a CLI arg and in frontmatter) understands git revisions.

You can do that by specifying the version like this.

codespin gen calculate_area.py.md --out calculate_area.py --include HEAD:calculate_area.py -w

You can include diffs as well:

# Diff a file between two versions
codespin gen main.py.md --out main.py --include HEAD~2+HEAD:main.py -w

There are some convenient shortcuts.

# include HEAD:main.py
codespin gen main.py.md --out main.py --include :main.py -w

# diff between HEAD and Working Copy
codespin gen main.py.md --out main.py --include +:main.py -w

# diff between HEAD~2 and Working Copy
codespin gen main.py.md --out main.py --include HEAD~2+:main.py -w

The first shortcut above (:main.py) ignores the latest edits to main.py and uses the content from git's HEAD.

Options for codespin gen

  • -c, --config <file path>: Path to a config directory (.codespin).
  • -e, --exec <script path>: Execute a command for each generated file.
  • -g, --go: Shorthand which sets the template to plain.mjs and parsing to false.
  • -i, --include <file path>: List of files to include in the prompt for additional context.
  • -o, --out <output file path>: Specify the output file name to generate.
  • -p, --prompt <some text>: Specify the prompt directly on the command line.
  • -t, --template <template name or path>: Path to the template file.
  • -w, --write: Write generated code to source file(s).
  • --debug: Enable debug mode. Prints debug messages for every step.
  • --declare <file path>: Specify declaration files for additional context. Repeat for multiple files.
  • --exclude <file path>: List of files to exclude from the prompt. Used to override automatically included source files.
  • --maxDeclare <count>: The maximum number of declaration files allowed. Defaults to 10.
  • --maxTokens: Maximum number of tokens for generated code.
  • --model <model name>: Name of the model to use, such as 'openai:gpt-4' or 'anthropic:claude-3-haiku-20240307'.
  • --outDir <dir path>: Path to directory relative to which files are generated. Defaults to the directory of the prompt file.
  • --parse: Whether the LLM response needs to be processed. Defaults to true. Use --no-parse to disable parsing.
  • --parser <path to js file>: Use a custom script to parse LLM response.
  • --pp, --printPrompt: Print the generated prompt to the screen. Does not call the API.
  • --spec: Specify a spec (prompt template) file.
  • --templateArgs <argument>: An argument passed to a custom template. Pass multiple arguments by repeating -a.
  • --writePrompt: Write the generated prompt out to the specified path. Does not call the API.
  • -h, --help: Display help.
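For example, several of these options can be combined in a single invocation (an illustrative command; adjust the file names and the model to your project):

codespin gen main.py.md \
  --out main.py \
  --include "*.py" \
  --exclude tests.py \
  --model anthropic:claude-3-haiku-20240307 \
  --write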

Inline Prompting

As shown earlier, you can specify the prompt directly in the command line:

codespin gen --prompt 'Create a file main.py with a function to add two numbers.'

Remember to use --write to save the generated files.

Custom Templates

A CodeSpin Template is a JS file (an ES6 Module) exporting a default function with the following signature:

// The templating function that generates the LLM prompt.
export default function generate(args: TemplateArgs): TemplateResult {
  // Return the prompt to send to the LLM.
}

where TemplateResult and TemplateArgs are defined as follows:

// Output of the template
export type TemplateResult = {
  // The generated prompt
  prompt: string;
  //Optional. Which type of parser should parse the response?
  responseParser?: "file-block" | "diff" | undefined; 
};

// Arguments to the templating function
export type TemplateArgs = {
  prompt: string;
  promptWithLineNumbers: string;
  include: VersionedFileInfo[];
  declare: BasicFileInfo[];
  outPath: string | undefined;
  promptSettings: unknown;
  templateArgs: string[] | undefined;
  workingDir: string;
};

export type BasicFileInfo = {
  path: string;
  contents: string;
};

export type VersionedFileInfo =
  | {
      path: string;
      type: "contents";
      contents: string;
      version: string;
    }
  | {
      path: string;
      type: "diff";
      diff: string;
      version1: string | undefined;
      version2: string | undefined;
    };
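For example, here is a minimal sketch of a custom template following the types above (illustrative only; the bundled templates created by codespin init are more complete):

// mypythontemplate.mjs - a minimal custom template (sketch)
export default function generate(args) {
  // Gather the plain contents of any files passed via --include.
  const included = args.include
    .filter((file) => file.type === "contents")
    .map((file) => `File: ${file.path}\n${file.contents}`)
    .join("\n\n");

  const prompt = [
    args.prompt,
    included ? `Relevant files:\n\n${included}` : "",
    args.outPath ? `Write the result to ${args.outPath}.` : "",
  ]
    .filter(Boolean)
    .join("\n\n");

  // Have codespin parse the response as file blocks.
  return { prompt, responseParser: "file-block" };
}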

When generating code, specify custom templates with the --template (or -t) option:

codespin gen main.py.md --out main.py --template mypythontemplate.mjs --include main.py -w

💡: Your template should have the extension .mjs instead of .js.

Once you do codespin init, you should be able to see example templates under the ".codespin/templates" directory.

There are two ways to pass custom args to a custom template.

  1. Front-matter in a prompt file is passed to the template under args.promptSettings:
---
model: openai:gpt-3.5-turbo-16k
maxTokens: 8000
useJDK: true # custom arg
out: main.py
---
  2. CLI args can be passed to the template with -a (or --template-args), and they'll be available in args.templateArgs as a string array:
codespin gen main.py.md \
  --template mypythontemplate.mjs \
  -a useAWS \
  -a swagger \
  --out main.py \
  --include main.py \
  --write
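Inside the template, both kinds of arguments are then available on args (a sketch continuing the example above; useJDK and useAWS are just the illustrative names used earlier):

export default function generate(args) {
  // Front-matter fields (e.g. useJDK) arrive via args.promptSettings.
  const settings = args.promptSettings ?? {};
  // CLI -a values arrive via args.templateArgs as a string array.
  const flags = args.templateArgs ?? [];

  let prompt = args.prompt;
  if (settings.useJDK) prompt += "\nTarget the JDK.";
  if (flags.includes("useAWS")) prompt += "\nUse the AWS SDK where appropriate.";

  return { prompt };
}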

Using with ChatGPT

While using codespin with an API key is straightforward, if you don't have one but have access to ChatGPT, there are alternatives.

Use the --pp (or --print-prompt) option to display the final LLM prompt, or --write-prompt to save it to a file:

# Display on screen
codespin gen something.py.md --print-prompt

# Or save to a file
codespin gen something.py.md --write-prompt /path/to/file.txt

Copy and paste the prompt into ChatGPT. Save ChatGPT's response in a file, e.g., gptresponse.txt.

Then, use the codespin parse command to parse the content:

# As always, use --write for writing to the disk
codespin parse gptresponse.txt --write

💡: When copying the response from ChatGPT, use the copy icon. Selecting text and copying doesn't retain formatting.

One more thing - Piping into the LLM!

Well, prompts can include data that was piped into codespin gen as well. :)

In your prompt, codespin:stdin will refer to whatever was passed to codespin.

For example, let's pipe the output of the ls command into codespin:

ls | codespin gen -p $'Convert to uppercase each line in the following text \ncodespin:stdin' -t plain.mjs --no-parse

The above example uses the included plain.mjs template along with the --no-parse option to print the LLM's response directly to the console. This is so handy that there's a shorthand for it: the -g option (g for Go).

# This
ls | codespin gen -p $'Convert to uppercase each line in the following text \ncodespin:stdin' -t plain.mjs --no-parse

# can be written as
ls | codespin gen -p $'Convert to uppercase each line in the following text \ncodespin:stdin' -g

Contributing

If you find more effective templates or prompts, please open a Pull Request.
