@lobehub/seo-cli v1.4.3 • Published 5 months ago • MIT License

Lobe SEO is a workflow tool that automates SEO frontmatter generation using ChatGPT.


✨ Features

  • 🤖 Automate SEO frontmatter generation using ChatGPT
  • ♻️ Support incremental SEO updates, automatically filling in missing SEO fields
  • 🛠️ Support custom OpenAI models, API proxies, and temperature
  • 📝 Support SEO automation for Markdown and MDX files
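For instance, the tool fills in SEO-related frontmatter fields at the top of an MDX document. The fields below are illustrative, not the tool's guaranteed output:

```mdx
---
title: Getting Started
description: Learn how to install and configure the CLI in under five minutes.
tags:
  - tutorial
  - installation
---

# Getting Started
```

With incremental updates enabled, documents that already contain all SEO fields are skipped.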

📦 Installation

To install Lobe SEO, run the following command:

npm install -g @lobehub/seo-cli

[!IMPORTANT]
Make sure your environment has Node.js version >= 18.

🤯 Usage

To initialize the Lobe SEO configuration, run the following command:

$ lobe-seo -o # or use the full flag --option

[!IMPORTANT]
To use AI auto-generation, you need to fill in the OpenAI token in the settings.

# Generate SEO frontmatter
$ lobe-seo

# Specify a configuration file
$ lobe-seo -c './custom-config.js' # or use the full flag --config

Configuration

You can use any configuration format supported by cosmiconfig:

  • seo property in package.json
  • .seorc file in JSON or YAML format
  • .seorc.json, .seorc.yaml, .seorc.yml, .seorc.js, .seorc.cjs files

[!TIP]

This project provides a type-safe defineConfig helper that can be imported from @lobehub/seo-cli.

Environment Variables

Some additional configuration options are set via environment variables:

| Environment Variable | Type | Description | Example |
| --- | --- | --- | --- |
| OPENAI_API_KEY | Required | The API key you obtained from the OpenAI account page | sk-xxxxxx...xxxxxx |
| OPENAI_PROXY_URL | Optional | If you manually configure an OpenAI API proxy, use this setting to override the default OpenAI API request base URL. Default: https://api.openai.com/v1 | https://api.chatanywhere.cn/v1 |
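Both variables can be set in the shell before running the CLI. The key and proxy URL below are placeholder values:

```shell
# Required: API key from the OpenAI account page
export OPENAI_API_KEY="sk-xxxxxx"

# Optional: override the default base URL (https://api.openai.com/v1) when using a proxy
export OPENAI_PROXY_URL="https://api.chatanywhere.cn/v1"
```

With these exported, subsequent `lobe-seo` invocations in the same shell pick them up automatically.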

🔍 Configuration

| Property Name | Required | Type | Default Value | Description |
| --- | --- | --- | --- | --- |
| entry | * | string | - | Entry file or folder |
| entryExtension |  | string | .mdx | Entry file extension |
| groupKey |  | string | - | Set the group key for SEO frontmatter |
| tagStringify |  | boolean | false | Stringify the tags array |
| modelName |  | string | gpt-3.5-turbo | Model used |
| temperature |  | number | 0 | Sampling temperature used |
| reference |  | string | - | Provide rules for more accurate SEO |
| concurrency |  | number | 5 | Number of concurrently pending promises returned |
| experimental |  | experimental | {} | Experimental features, see below |
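A sketch of a fuller .seorc.js exercising more of the options above; all values are illustrative defaults rather than recommendations:

```javascript
const { defineConfig } = require('@lobehub/seo-cli');

module.exports = defineConfig({
  entry: './docs/**/*.mdx',   // entry file or folder
  entryExtension: '.mdx',     // file extension to process
  groupKey: 'docs',           // group key written into the frontmatter
  tagStringify: false,        // keep tags as an array rather than a string
  modelName: 'gpt-3.5-turbo', // OpenAI model to use
  temperature: 0,             // deterministic sampling
  concurrency: 5,             // number of concurrent requests
  reference: 'Docs for a CLI that generates SEO frontmatter', // hint for more accurate SEO
});
```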

experimental

| Property Name | Required | Type | Default Value | Description |
| --- | --- | --- | --- | --- |
| jsonMode |  | boolean | false | Enable GPT forced JSON output for improved stability (only supported by models released after November 2023) |

Example 1 .seorc.js

const { defineConfig } = require('@lobehub/seo-cli');

module.exports = defineConfig({
  entry: './docs/**/*.mdx',
  modelName: 'gpt-3.5-turbo-1106',
  experimental: {
    jsonMode: true,
  },
});

Example 2 .seorc.json

{
  "entry": "./docs/**/*.mdx",
  "experimental": {
    "jsonMode": true
  },
  "modelName": "gpt-3.5-turbo-1106"
}

Example 3 package.json

{
  "...": "...",
  "seo": {
    "entry": "./docs/**/*.mdx",
    "modelName": "gpt-3.5-turbo-1106",
    "experimental": {
      "jsonMode": true
    }
  }
}

Running

Automatically generate SEO files using the lobe-seo command:

$ lobe-seo

⌨️ Local Development

You can use GitHub Codespaces for online development.

Alternatively, you can clone the repository and run the following commands for local development:

$ git clone https://github.com/lobehub/lobe-cli-toolbox.git
$ cd lobe-cli-toolbox
$ bun install
$ cd packages/lobe-seo
$ bun dev

🤝 Contributing

We welcome contributions in all forms. If you are interested in contributing code, you can check out our GitHub Issues, showcase your creativity, and share your ideas with us.

🔗 Links

More Products

  • 🤖 Lobe Chat - An open-source, extensible (Function Calling), high-performance chatbot framework. It supports one-click free deployment of your private ChatGPT/LLM web application.
  • 🤯 Lobe Theme - The modern theme for stable diffusion webui, exquisite interface design, highly customizable UI, and efficiency boosting features.

📝 License

Copyright © 2023 LobeHub. This project is licensed under MIT.