# Parse Robots

makestatic-parse-robots v1.0.17

> Parse robots.txt files to an AST

Parses robots.txt files to an abstract syntax tree. Designed so that the sitemap plugin can write the sitemap URL to the robots.txt without string manipulation.
## Install

```
yarn add makestatic-parse-robots
```
## API
### ParseRobots

Parses robots.txt files to abstract syntax trees.
#### .sources

```javascript
ParseRobots.prototype.sources(file, context)
```

Parse a robots.txt file to an abstract syntax tree; the parsed AST is assigned to `file.ast.robots`.

* `file` Object the current file.
* `context` Object the processing context.
### RobotsLine

Represents a parsed line in a robots.txt file.

#### RobotsLine

```javascript
new RobotsLine(line, index[, key][, value])
```

Create a RobotsLine.

* `line` String the raw line input.
* `index` Number the zero-based line number.
* `key` String the declaration key.
* `value` String the declaration value.
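To make the per-line fields concrete, here is an illustrative sketch of splitting a raw line into the key, value, and comment a `RobotsLine` exposes. This is not the library's implementation; `splitLine` is a hypothetical helper.

```javascript
// Illustrative sketch (not the library's code): split a raw robots.txt
// line into the fields a RobotsLine exposes. The comment starts at the
// first '#'; the key/value pair is split on the first ':' before it.
function splitLine (line, index) {
  const result = { line, lineno: index + 1, key: '', value: '', comment: '' }
  const hash = line.indexOf('#')
  let declaration = line
  if (hash > -1) {
    result.comment = line.slice(hash)
    declaration = line.slice(0, hash)
  }
  const colon = declaration.indexOf(':')
  if (colon > -1) {
    result.key = declaration.slice(0, colon).trim()
    result.value = declaration.slice(colon + 1).trim()
  }
  return result
}

const parsed = splitLine('Disallow: /private/ # block crawl', 1)
console.log(parsed.key)     // 'Disallow'
console.log(parsed.value)   // '/private/'
console.log(parsed.comment) // '# block crawl'
console.log(parsed.lineno)  // 2 (one-based)
```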
#### key

`String key`

The declaration key.

#### value

`String value`

The declaration value.

#### comment

`readonly String comment`

A comment on this line.

#### line

`readonly String line`

The raw input line.

#### lineno

`readonly Number lineno`

The line number using a one-based index.
#### .hasComment

```javascript
RobotsLine.prototype.hasComment()
```

Determine if this line has a comment.

Returns a boolean indicating whether this line contains a comment.
#### .hasKeyPair

```javascript
RobotsLine.prototype.hasKeyPair()
```

Determine if this line has a valid key/value pair.

Returns a boolean indicating whether this line contains a key and value.
#### .parse

```javascript
RobotsLine.prototype.parse()
```

Parse the line into this instance.

Returns this line instance.
#### .serialize

```javascript
RobotsLine.prototype.serialize()
```

Get a serialized line from the current state.

Returns a string line value.
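As a rough sketch of what serializing from the current state involves (again a hypothetical helper, not this package's code), a line can be rebuilt from its `key`, `value`, and `comment` fields:

```javascript
// Hypothetical helper (not the library's implementation): rebuild a raw
// robots.txt line from key/value/comment state.
function serializeLine ({ key, value, comment }) {
  const parts = []
  if (key) parts.push(`${key}: ${value}`)
  if (comment) parts.push(comment)
  return parts.join(' ')
}

console.log(serializeLine({ key: 'Allow', value: '/', comment: '# root' }))
// 'Allow: / # root'
```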
### RobotsParser

Parse and serialize a robots.txt file.

Designed so that the serialized output has a 1:1 relationship with the source document, but allows inspecting and modifying the `key` and `value` properties for each line.
#### .parse

```javascript
RobotsParser.prototype.parse(content)
```

Parse the robots.txt file content.

```
User-Agent: *
Disallow: /private/ # does not block indexing, add meta noindex
```

Becomes:

```javascript
[
  {
    key: 'User-Agent',
    value: '*',
    lineno: 1,
    line: 'User-Agent: *'
  },
  {
    key: 'Disallow',
    value: '/private/',
    lineno: 2,
    line: 'Disallow: /private/ # does not block indexing, add meta noindex',
    comment: '# does not block indexing, add meta noindex'
  }
]
```

Returns an array of line objects.

* `content` String the robots.txt file content.
#### .serialize

```javascript
RobotsParser.prototype.serialize(list)
```

Serialize the robots.txt declaration list.

Returns a string of robots.txt file content.

* `list` Array the parsed robots.txt declaration list.
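The 1:1 round-trip property described above can be sketched with a minimal standalone parse/serialize pair. This is illustrative only: `parse`, `serialize`, the `modified` flag, and the example.com URL are all assumptions, not this package's API — untouched lines survive verbatim, while new declarations (such as a sitemap URL) are appended as structured entries rather than by string surgery.

```javascript
// Illustrative sketch (not the library's implementation) of a parser
// whose serialized output is byte-identical to its input for lines
// that were not modified.
function parse (content) {
  return content.split('\n').map((line, index) => {
    const item = { line, lineno: index + 1 }
    const match = /^([^:#]+):\s*([^#]*?)\s*(#.*)?$/.exec(line)
    if (match) {
      item.key = match[1].trim()
      item.value = match[2]
      if (match[3]) item.comment = match[3]
    }
    return item
  })
}

function serialize (list) {
  return list
    .map((item) => (item.modified ? `${item.key}: ${item.value}` : item.line))
    .join('\n')
}

const list = parse('User-Agent: *\nDisallow:')
// untouched input round-trips verbatim
console.log(serialize(list) === 'User-Agent: *\nDisallow:') // true

// append a sitemap declaration without touching the other lines
list.push({ key: 'Sitemap', value: 'https://example.com/sitemap.xml', modified: true })
console.log(serialize(list))
```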
## License

MIT

Created by mkdoc on March 12, 2017