robots-util v0.1.2
Parse Robots
Parses robots.txt files and provides related utilities.
Designed so that the sitemap plugin can write the sitemap URL to the robots.txt without string manipulation.
Install
npm install robots-util
yarn add robots-util
API
RobotsLine
Represents a parsed line in a robots.txt file.
RobotsLine
new RobotsLine(line, index[, key][, value])
Create a RobotsLine.
line String the raw line input.
index Number the zero-based line number.
key String the declaration key.
value String the declaration value.
.parse
RobotsLine.prototype.parse()
Parse the line into this instance.
Returns this line instance.
.serialize
RobotsLine.prototype.serialize()
Get a serialized line from the current state.
Returns a string line value.
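For example, a single line can be parsed and re-serialized; a minimal sketch, assuming RobotsLine is exported from the package root (the export path is an assumption, not documented above):
const { RobotsLine } = require('robots-util')
// Parse one raw line; the second argument is the zero-based line number.
const line = new RobotsLine('Sitemap: https://example.com/sitemap.xml', 0)
line.parse()
console.log(line.key)         // 'Sitemap'
console.log(line.value)       // 'https://example.com/sitemap.xml'
console.log(line.serialize()) // 'Sitemap: https://example.com/sitemap.xml'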
RobotsParser
Parse and serialize a robots.txt file.
Designed so that the serialized output has a 1:1 relationship with the
source document but allows inspecting and modifying the key and value
properties for each line.
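A short sketch of that round trip, assuming RobotsParser is exported from the package root and constructed with new (both assumptions):
const { RobotsParser } = require('robots-util')
const parser = new RobotsParser()
const content = 'User-Agent: *\nDisallow: /private/'
// With no modifications, serializing the parsed list should reproduce
// the source document, per the 1:1 design described above.
const list = parser.parse(content)
console.log(parser.serialize(list) === content)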
.parse
RobotsParser.prototype.parse(content)
Parse the robots.txt file content.
User-Agent: *
Disallow: /private/ # does not block indexing, add meta noindex
Becomes:
[
  {
    key: 'User-Agent',
    value: '*',
    lineno: 1,
    line: 'User-Agent: *'
  },
  {
    key: 'Disallow',
    value: '/private/',
    lineno: 2,
    line: 'Disallow: /private/ # does not block indexing, add meta noindex',
    comment: '# does not block indexing, add meta noindex'
  }
]
Returns an array of line objects.
content String the robots.txt file content.
.serialize
RobotsParser.prototype.serialize(list)
Serialize the robots.txt declaration list.
Returns a string of robots.txt file content.
list Array the parsed robots.txt declaration list.
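A hypothetical end-to-end sketch in the spirit of the sitemap use case described at the top: edit a declaration through its value property, then write the list back. It assumes serialize() rebuilds each line from the key and value properties, which the 1:1 design implies but this README does not spell out:
const { RobotsParser } = require('robots-util')
const parser = new RobotsParser()
const list = parser.parse('User-Agent: *\nDisallow: /private/')
// Update a declaration in place instead of manipulating strings.
const disallow = list.find((item) => item.key === 'Disallow')
disallow.value = '/secret/'
console.log(parser.serialize(list))
// User-Agent: *
// Disallow: /secret/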
License
MIT