
rfc-spider v2.0.1


Spider

This is a Node module that lets you build website crawlers.

It manages multiple requests and concatenates the chunked response data from Node before passing it to the provided parser. The parser can add more tasks based on the parsed data. The number of concurrent requests is limited to avoid overloading the target server.

Usage

This module contains a single class called Spider.

new Spider(tasks[, max = 50])

  • tasks: Array of {uri: string|object, parse, body}
  • parse: function(Buffer data, http.IncomingMessage response, http.ClientRequest request)
  • max: Maximum number of concurrent requests

If uri is a string, it is parsed with url.parse. The resulting object is then passed to http.request. A default http.Agent and default headers are used if the object does not provide them. The body, if given, is passed to the http.ClientRequest.end method, which is useful for POST requests.
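For instance, a POST task might look like the following sketch. The endpoint, headers, and body are placeholders, not part of rfc-spider; the uri fields simply mirror the http.request options described above.

```js
// Hypothetical POST task; all values below are illustrative placeholders.
const postTask = {
  uri: {
    protocol: 'http:',
    hostname: 'example.com',   // placeholder host
    path: '/search',
    method: 'POST',
    headers: { 'Content-Type': 'application/x-www-form-urlencoded' }
  },
  body: 'q=rfc',               // passed to http.ClientRequest.end
  parse(data, response, request) {
    // data is the concatenated response Buffer
    console.log(response.statusCode, data.toString());
  }
};
```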

The parse function receives the response data concatenated from the multiple data events. It can add more tasks based on the parsed data. Since the data is a Buffer, you can use its toString method to convert it to a string.

See the example in the code for more details.
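As a rough illustration of that flow (not the bundled example), here is a minimal sketch. It assumes the module exports the Spider class directly and that parse hands new tasks back by returning them; both are assumptions, so consult the example in the code for the actual mechanism.

```js
const Spider = require('rfc-spider'); // assumes the class is the default export

const tasks = [{
  uri: 'http://example.com/',         // placeholder start page
  parse(data, response, request) {
    const html = data.toString();     // data arrives as a Buffer
    // Queue a follow-up request; returning new tasks is an assumed mechanism.
    if (html.includes('href="/next"')) {
      return [{ uri: 'http://example.com/next', parse() {} }];
    }
  }
}];

// Limit the crawl to at most 10 concurrent requests (default is 50).
new Spider(tasks, 10);
```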

Install

npm install rfc-spider

License

The source code is licensed under GPL v3. The full license text is available in the repository.
