crawler
Crawler is a ready-to-use web spider with proxy support, asynchronous crawling, rate limiting, configurable request pools, server-side jQuery, and HTTP/2 support.
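The rate limiting and request pooling that Crawler advertises can be illustrated with a minimal promise queue. This is a hand-rolled sketch of the idea, not Crawler's actual implementation or API; `createPool`, `concurrency`, and `delayMs` are illustrative names.

```javascript
// Minimal sketch of a rate-limited request pool: at most `concurrency`
// tasks run at once, with `delayMs` between task starts.
function createPool(concurrency, delayMs) {
  let active = 0;
  const queue = [];
  function next() {
    if (active >= concurrency || queue.length === 0) return;
    active++;
    const { task, resolve, reject } = queue.shift();
    task().then(resolve, reject).finally(() => {
      active--;
      setTimeout(next, delayMs); // rate limit: pause before the next start
    });
  }
  // enqueue(() => somePromiseReturningTask()) resolves with the task's result
  return function enqueue(task) {
    return new Promise((resolve, reject) => {
      queue.push({ task, resolve, reject });
      next();
    });
  };
}
```

In real use the enqueued task would be an HTTP request, e.g. `enqueue(() => fetch(url))`.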
Library for working with complex domain names, subdomains, and URIs.
Package renamed to urijs; please update your dependencies!
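A naive sketch of the subdomain/domain splitting such a library performs. A real implementation consults the public-suffix list (so that, say, `example.co.uk` is handled correctly); this deliberately does not, and `splitHost` is an illustrative name, not the library's API.

```javascript
// Naive host splitting: last label is the TLD, last two labels are the
// registrable domain, everything before that is the subdomain.
function splitHost(hostname) {
  const parts = hostname.split('.');
  return {
    tld: parts[parts.length - 1],
    domain: parts.slice(-2).join('.'),
    subdomain: parts.slice(0, -2).join('.'),
  };
}
console.log(splitHost('api.eu.example.com'));
// → { tld: 'com', domain: 'example.com', subdomain: 'api.eu' }
```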
URL rewrite middleware for koa
A Bit.ly API library for Node.js
Resolve a URI relative to an optional base URI
Performant utilities for URL resolution and parsing built on core url.
URL key-value cache.
Download website to a local directory (including all css, images, js, etc.)
Node.js module to generate URL slugs. Another one? This one cares about i18n and transliterates non-Latin scripts to conform to RFC 3986. Mostly API-compatible with similar modules.
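A stripped-down sketch of what slug generation involves, restricted to RFC 3986 unreserved characters. This is not the module's implementation: the real package also transliterates non-Latin scripts, which this accent-stripping approach cannot do.

```javascript
// Minimal slugifier: decompose accents, drop combining marks, and collapse
// everything outside [a-z0-9] into single hyphens.
function slugify(text) {
  return text
    .normalize('NFKD')                 // split accented chars into base + mark
    .replace(/[\u0300-\u036f]/g, '')   // drop the combining marks
    .toLowerCase()
    .replace(/[^a-z0-9]+/g, '-')       // collapse the rest into hyphens
    .replace(/^-+|-+$/g, '');          // trim leading/trailing hyphens
}
console.log(slugify('Écoute: Hello, World!')); // → 'ecoute-hello-world'
```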
Find broken links, missing images, etc. in your HTML.
Returns the GitHub repository URL based on package.json
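A hypothetical helper illustrating the idea: derive a canonical GitHub URL from the common shapes of the `repository` field in package.json. The function name and regexes are this sketch's own, not the module's code.

```javascript
// Handles both `repository: "user/repo"` shorthand and full
// `repository: { url: "git+https://github.com/user/repo.git" }` forms.
function githubUrl(pkg) {
  const repo = typeof pkg.repository === 'string'
    ? pkg.repository
    : (pkg.repository && pkg.repository.url) || '';
  const m = repo.match(/github\.com[/:]([^/]+)\/([^/.]+)/) ||
            repo.match(/^([^/:]+)\/([^/]+)$/); // shorthand like "user/repo"
  return m ? `https://github.com/${m[1]}/${m[2]}` : null;
}
console.log(githubUrl({ repository: 'git+https://github.com/expressjs/express.git' }));
// → 'https://github.com/expressjs/express'
```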
Node.js module for the TinyURL.com URL shortener
Generic file download utility
Progress bar plugin for download
Node.js module for the http://is.gd URL shortener
Robustly checks an array of URLs for liveness.
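The shape of such a liveness check can be sketched with `Promise.allSettled`. The probe is injected so the example stays self-contained; in real use it would issue an HTTP request. `checkUrls` and `probe` are illustrative names, not the package's API.

```javascript
// Check every URL concurrently; a URL is "alive" only if its probe
// resolved to true (rejections count as dead, never as crashes).
async function checkUrls(urls, probe) {
  const results = await Promise.allSettled(urls.map(u => probe(u)));
  return urls.map((url, i) => ({
    url,
    alive: results[i].status === 'fulfilled' && results[i].value === true,
  }));
}
// A real probe could be (assumes global fetch, Node >= 18):
// const probe = url => fetch(url, { method: 'HEAD' }).then(r => r.ok);
```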
A simple Vue directive to turn URLs and emails into clickable links
Parse MongoDB connection strings.
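Simple single-host connection strings can already be picked apart with Node's built-in `URL` class, which is a reasonable way to see what such a parser extracts. Multi-host replica-set strings (`mongodb://h1,h2,h3/db`) are where the built-in class falls short and a dedicated parser earns its keep.

```javascript
// mongodb:// is a non-special scheme, but WHATWG URL still parses its
// authority component (userinfo, host, port) and path.
const u = new URL('mongodb://appuser:secret@localhost:27017/mydb');
console.log(u.username, u.hostname, u.port, u.pathname.slice(1));
// user, host, port, and database name
```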
PostHTML plugin for transforming URLs.