# metalsmith-robots
A metalsmith plugin for generating a robots.txt file
This plugin generates a robots.txt file. It accepts global options and can be triggered from a file's frontmatter with the `public` and `private` keywords. It works well with metalsmith-mapsite, which also supports setting a page to private from the frontmatter.

For support questions please use Stack Overflow or the Metalsmith Slack channel.
## Installation
```
$ npm install metalsmith-robots
```

## Example
Configuration in `metalsmith.json`:
```json
{
  "plugins": {
    "metalsmith-robots": {
      "useragent": "googlebot",
      "allow": ["index.html", "about.html"],
      "disallow": ["404.html"],
      "sitemap": "https://www.site.com/sitemap.xml"
    }
  }
}
```

Which will generate the following robots.txt:
```
User-agent: googlebot
Allow: index.html
Allow: about.html
Disallow: 404.html
Sitemap: https://www.site.com/sitemap.xml
```

## Options
You can pass options to metalsmith-robots with the JavaScript API or CLI, as shown below. The options are:
- `useragent`: the useragent - String, default: `*`
- `allow`: an array of the url(s) to allow - Array of Strings
- `disallow`: an array of the url(s) to disallow - Array of Strings
- `sitemap`: the sitemap url - String
- `urlMangle`: mangle paths in `allow` and `disallow` - Function
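For instance, here is a minimal sketch of passing the same options through the JavaScript API (the `robots` require name and the build callback are illustrative, not prescribed by the plugin):

```js
const Metalsmith = require('metalsmith');
const robots = require('metalsmith-robots');

Metalsmith(__dirname)
  // Generate robots.txt alongside the built files
  .use(robots({
    useragent: 'googlebot',
    allow: ['index.html', 'about.html'],
    disallow: ['404.html'],
    sitemap: 'https://www.site.com/sitemap.xml'
  }))
  .build((err) => {
    if (err) throw err;
  });
```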
Besides these options, setting `public: true` or `private: true` in a file's frontmatter will add that page to the `allow` or `disallow` option respectively. metalsmith-robots expects at least one of `allow`, `disallow`, or `sitemap`; without them it will not generate a robots.txt.
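For example, a page could exclude itself from robots.txt indexing through its frontmatter like this (a sketch; the title and page content are hypothetical):

```markdown
---
title: Drafts
private: true
---

This page will end up in the Disallow list of the generated robots.txt.
```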
### urlMangle
To make sure paths start with a `/`, you can mangle the urls provided via `allow` and `disallow`. The mangle below, for example, turns `about.html` into `/about.html`:
```js
.use(robots({
  urlMangle: (filepath) => {
    return (filepath.slice(0, 1) !== '/') ? `/${filepath}` : filepath;
  }
}))
```

## License
MIT