makestatic-parse-robots v1.0.17
Parse Robots
Parse robots.txt files to an AST
Parses robots.txt files to an abstract syntax tree.
Designed so that the sitemap plugin can write the sitemap URL to the robots.txt without string manipulation.
Install
yarn add makestatic-parse-robots
API
ParseRobots
Parses robots.txt files to abstract syntax trees.
See Also
.sources
ParseRobots.prototype.sources(file, context)
Parse a robots.txt file to an abstract syntax tree; the parsed AST is
assigned to file.ast.robots.
file Object the current file.
context Object the processing context.
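A minimal sketch of how this might be invoked, assuming the plugin class is the module's default export and that file and context follow the makestatic plugin interface (both are assumptions; the processing pipeline normally constructs and calls the plugin itself):

```javascript
// Sketch only: the export shape and the file/context structure are assumptions.
const ParseRobots = require('makestatic-parse-robots')

const plugin = new ParseRobots()
const file = {
  // hypothetical file object carrying the robots.txt content
  contents: 'User-Agent: *\nDisallow: /private/\n',
  ast: {}
}
const context = {} // hypothetical processing context

plugin.sources(file, context)
// the parsed AST is now available to later plugins as file.ast.robots
```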
RobotsLine
Represents a parsed line in a robots.txt file.
RobotsLine
new RobotsLine(line, index[, key][, value])
Create a RobotsLine.
line String the raw line input.
index Number the zero-based line number.
key String the declaration key.
value String the declaration value.
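For illustration, assuming RobotsLine is reachable from the package (the import path shown is an assumption), a line can be created from the raw input and its index, or seeded directly with a key and value:

```javascript
// Hypothetical import path; only the constructor arguments follow the docs above.
const { RobotsLine } = require('makestatic-parse-robots')

// From raw input: key and value are filled in later by parse()
const raw = new RobotsLine('Disallow: /private/', 1)

// Seeded directly with a key/value pair (placeholder URL)
const seeded = new RobotsLine('', 2, 'Sitemap', 'https://example.com/sitemap.xml')
```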
key
String keyThe declaration key.
value
String valueThe declaration value.
comment
readonly String commentA comment on this line.
line
readonly String lineThe raw input line.
lineno
readonly Number linenoThe line number using a one-based index.
.hasComment
RobotsLine.prototype.hasComment()
Determine if this line has a comment.
Returns a boolean indicating if this line contains a comment.
.hasKeyPair
RobotsLine.prototype.hasKeyPair()
Determine if this line has a valid key/value pair.
Returns a boolean indicating if this line contains a key and value.
.parse
RobotsLine.prototype.parse()
Parse the line into this instance.
Returns this line instance.
.serialize
RobotsLine.prototype.serialize()
Get a serialized line from the current state.
Returns a string line value.
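Putting the line methods together, a plausible round trip for a single line might look like this (import path assumed as above; whether the trailing comment is preserved on output is an expectation based on the parser's 1:1 design goal, not a documented guarantee):

```javascript
const { RobotsLine } = require('makestatic-parse-robots')  // assumed export

const line = new RobotsLine('Disallow: /private/ # keep this private', 1)
line.parse()             // populate key, value and comment from the raw line

line.hasKeyPair()        // true: a key and value were found
line.hasComment()        // true: the trailing '# keep this private'

line.value = '/secret/'  // adjust the declaration value
line.serialize()         // expected: 'Disallow: /secret/ # keep this private'
```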
RobotsParser
Parse and serialize a robots.txt file.
Designed so that the serialized output has a 1:1 relationship with the
source document but allows inspecting and modifying the key and value
properties for each line.
.parse
RobotsParser.prototype.parse(content)
Parse the robots.txt file content.
User-Agent: *
Disallow: /private/ # does not block indexing, add meta noindex
Becomes:
[
{
key: 'User-Agent',
value: '*',
lineno: 1,
line: 'User-Agent: *'
},
{
key: 'Disallow',
value: '/private/',
lineno: 2,
line: 'Disallow: /private/ # does not block indexing, add meta noindex',
comment: '# does not block indexing, add meta noindex'
}
]
Returns an array of line objects.
content String the robots.txt file content.
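Assuming RobotsParser is exported alongside the plugin (an assumption), the call that produces the list above would look like:

```javascript
const { RobotsParser } = require('makestatic-parse-robots')  // assumed export

const parser = new RobotsParser()
const list = parser.parse(
  'User-Agent: *\nDisallow: /private/ # does not block indexing, add meta noindex'
)
// list is the array of line objects shown above
```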
.serialize
RobotsParser.prototype.serialize(list)
Serialize the robots.txt declaration list.
Returns a string of robots.txt file content.
list Array the parsed robots.txt declaration list.
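This is what enables the sitemap use case: parse the existing file, append a Sitemap declaration as a new line, and serialize the whole list back out. The sketch below assumes RobotsParser and RobotsLine are both exported and that serialize() accepts the list returned by parse() with extra lines appended (assumptions; the URL is a placeholder):

```javascript
const { RobotsParser, RobotsLine } = require('makestatic-parse-robots')  // assumed exports

const parser = new RobotsParser()
const list = parser.parse('User-Agent: *\nDisallow: /private/\n')

// Append a Sitemap declaration without touching the original lines
const sitemap = new RobotsLine('', list.length, 'Sitemap', 'https://example.com/sitemap.xml')
list.push(sitemap)

const output = parser.serialize(list)
// existing lines round-trip 1:1; the new declaration is appended at the end
```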
License
MIT
Created by mkdoc on March 12, 2017