|It has been suggested that this page be merged with Proposal:Distributed Wikipedia. (Discuss)|
Create a peer-to-peer network to store Wikipedia data.
A simple P2P client is used to browse, read and edit Wikipedia pages. Every user has a local cache where downloaded data is stored. When you search for a page, the program first checks whether it is available in the cache (and shows it if so); otherwise it searches the P2P network and downloads it (the Wikimedia server is, of course, also part of the P2P network). The program then searches for any updates to the page and adds them to the local cache. If you modify a page, the changes are added (as an update) to the copy stored in the local cache and then, if you are online, uploaded to the Wikimedia server. Other peers can, of course, download data from your cache.
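The lookup-and-edit flow described above can be sketched as follows. This is only an illustrative outline, not a real client: the names (`LocalCache`, `fetch_from_peers`, `fetch_from_server`) are invented here, and the peer and server fetches are stubbed out.

```python
class LocalCache:
    """In-memory stand-in for the on-disk cache each peer keeps."""
    def __init__(self):
        self.pages = {}          # title -> page text
        self.pending_edits = []  # edits made while offline

    def get(self, title):
        return self.pages.get(title)

    def put(self, title, text):
        self.pages[title] = text

def fetch_from_peers(title):
    # Placeholder: a real client would query the P2P overlay here.
    return None

def fetch_from_server(title):
    # Placeholder: fall back to the central Wikimedia server,
    # which is itself just another peer on the network.
    return "article text for " + title

def lookup(cache, title):
    """Cache first, then other peers, then the central server."""
    page = cache.get(title)
    if page is None:
        page = fetch_from_peers(title) or fetch_from_server(title)
        cache.put(title, page)  # the local copy can now serve other peers
    return page

def edit(cache, title, new_text, online):
    """Apply an edit to the local cache; upload only when online."""
    cache.put(title, new_text)
    if not online:
        cache.pending_edits.append((title, new_text))
    # else: upload the update to the Wikimedia server immediately
```

Note that `lookup` writes whatever it fetched back into the cache, which is what makes each user's computer store (and re-share) a part of Wikipedia.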
In that way, you never have to download anything you don't need, although your computer will store a part of Wikipedia. The server will have less work to do (parts of Wikipedia can be downloaded from other peers), and if something happens to the central server, Wikipedia will still be available. Users who share a LAN behind a NAT also gain a big advantage: if their Internet access is slow and many of them need the same article, it has to be downloaded only once.
It would increase accessibility and improve Wikipedia's long-term survival odds (many copies would be available in different places).
Which P2P protocol should be used (BitTorrent? eDonkey?)?
Can this software be built?
How much will it cost?
Do you have a thought about this proposal? A suggestion? Discuss this proposal by going to Proposal talk:P2P storage.
Want to work on this proposal?
- .. Sign your name here!