wintersmith-robots v0.2.6 • published 9 years ago

wintersmith-robots

A Wintersmith plugin that generates a robots.txt file, with sitewide and per-page control over indexing.

Install

npm install wintersmith-robots

Then add wintersmith-contents and wintersmith-robots to the plugins list in your config.json:

{
  "plugins": [
    "wintersmith-contents",
    "wintersmith-robots"
  ]
}

Use

Set sitewide options under locals in Wintersmith's config.json. Note that if noindex is set globally, your entire site will be blocked from crawlers.

{
    "locals": {
        "sitemap": "sitemap.xml",
        "noindex": false
    }
}
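For orientation, a permissive configuration like the one above would typically yield a robots.txt along these lines. This is an illustrative sketch only; the plugin determines the exact output, and example.com stands in for your site's domain:

```
User-agent: *
Disallow:
Sitemap: http://example.com/sitemap.xml
```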

Set per-page options in the metadata block at the top of your Markdown files. For instance, you can prevent a single article from being indexed like so:

---
noindex: true
---
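Assuming the plugin translates per-page noindex flags into Disallow rules (the path /my-article/ below is a hypothetical example, not output confirmed from the plugin), the generated robots.txt would then contain an entry such as:

```
User-agent: *
Disallow: /my-article/
```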