wintersmith-robots v0.2.6 • Published 10 years ago

wintersmith-robots

A Wintersmith plugin that generates a robots.txt file, with both sitewide and per-page control over indexing.

Install

npm install wintersmith-robots

Add wintersmith-robots and wintersmith-contents to your config.json

{
  "plugins": [
    "wintersmith-contents",
    "wintersmith-robots"
  ]
}

Use

Set sitewide options in Wintersmith's config.json. If noindex is set globally, your entire site will be blocked from crawlers.

{
    "locals": {
        "sitemap": "sitemap.xml",
        "noindex": "false"
    }
}
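
With a configuration like the one above (sitemap set, noindex off), the plugin would emit a robots.txt that allows crawling and advertises the sitemap. The exact output below is an illustrative sketch, not verified against the plugin's source; the host in the Sitemap URL is a placeholder:

```
User-agent: *
Disallow:
Sitemap: http://example.com/sitemap.xml
```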

Set per-page options at the top of your Markdown files. For instance, you can prevent an article from being indexed like so:

---
noindex: true
---
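
With `noindex: true` in a page's front matter, the plugin would presumably add a Disallow rule for that page's URL to the generated robots.txt. The path below is hypothetical, standing in for whatever URL Wintersmith assigns the page:

```
User-agent: *
Disallow: /articles/my-article/
```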