# build-wikipedia-feed
Build a hyperdb feed of Wikipedia articles, including historical revisions.
## Rationale

### Problem
Wikipedia is an incredibly important collection of knowledge on the internet. It is free for everyone to read and edit. However, it is stored on only a handful of servers, in a handful of countries, controlled by a single organisation. This causes two main problems:
- Currently, it is too easy to censor Wikipedia. We need a system that supports redundancy without any additional effort.
- It does not work offline. Making an offline copy is complicated, and you usually have to download all articles for a language.
### Solution
Let's store Wikipedia's content in a peer-to-peer (P2P) system. By leveraging software from the Dat project, we won't have to reinvent the wheel. The Dat protocol efficiently syncs only the changes between two versions of data, allows for sparse & live replication, and is completely distributed.
This tool can extract articles from a Wikipedia dump or download them directly, and store them in a Dat archive. See below for more details.
## Installing

```shell
npm install -g build-wikipedia-feed
```

## Usage
This module exposes several command line building blocks.
### read all revisions of every article
Pipe a `stub-meta-history*` XML file into `wiki-revisions-list`. You will get an ndjson list of page revisions.
```shell
curl -s 'https://dumps.wikimedia.org/enwiki/20181001/enwiki-20181001-stub-meta-history1.xml.gz' | gunzip | wiki-revisions-list >revisions.ndjson
```
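The output is ndjson, i.e. one JSON object per line, so you can inspect records with standard tools. A quick sanity check, assuming you have `jq` installed:

```shell
# pretty-print the first revision record
head -n 1 revisions.ndjson | jq .
```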
### read the most recent revision of every article

Pipe a `stub-meta-current*` XML file into `wiki-revisions-list`. You will get an ndjson list of page revisions.
```shell
curl -s 'https://dumps.wikimedia.org/enwiki/20181001/enwiki-20181001-stub-meta-current1.xml.gz' | gunzip | wiki-revisions-list >revisions.ndjson
```
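Because each line holds exactly one revision, plain line-based tools work too:

```shell
# the line count equals the number of revisions
wc -l <revisions.ndjson
```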
### read articles being edited right now

Use `wiki-live-revisions`. You will get an ndjson list of page revisions.
```shell
wiki-live-revisions >revisions.ndjson
```
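`wiki-live-revisions` keeps streaming for as long as edits happen. To collect a bounded sample, e.g. for testing, you can stop it after a while; a sketch using GNU coreutils' `timeout` (not part of this tool):

```shell
# collect live revisions for 60 seconds, then stop
timeout 60 wiki-live-revisions >revisions.ndjson
```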
### fetch & store revisions in a hyperdb

Use `wiki-store-revisions` to write the HTML content of all revisions in `revisions.ndjson` into a hyperdb. The archive will be created under `p2p-wiki` in your system's data directory.
```shell
cat revisions.ndjson | wiki-store-revisions
```
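Because `wiki-revisions-list` writes ndjson to stdout and `wiki-store-revisions` reads it from stdin, the building blocks compose into a single pipeline, without an intermediate file:

```shell
# download a stub dump, extract the revision list and
# store the revisions' HTML in a hyperdb, in one go
curl -s 'https://dumps.wikimedia.org/enwiki/20181001/enwiki-20181001-stub-meta-current1.xml.gz' \
	| gunzip \
	| wiki-revisions-list \
	| wiki-store-revisions
```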
## Related

- `distributed-wikipedia-mirror` – Putting Wikipedia on IPFS.
- `fetch-wikipedia-page-revision` – Fetch a revision of a Wikipedia page as mobile HTML.
- `wikipedia-edits-stream` – A live stream of page edits on Wikipedia.
- `commons-photo-url` – Download Wikimedia Commons photos.
- `wiki-article-name-encoding` – Encode & decode Wiki(pedia) article names/slugs.
## Contributing
If you have a question or have difficulties using build-wikipedia-feed, please double-check your code and setup first. If you think you have found a bug or want to propose a feature, refer to the issues page.