Proposal talk:Wiki in a Stick, with auto-update

From Strategic Planning

Impact?

Some proposals will have massive impact on end-users, including non-editors. Some will have minimal impact. What will be the impact of this proposal on our end-users? -- Philippe 00:00, 13 September 2009 (UTC)

Technological Capabilities

Wikipedia is a huge project with millions of articles, many of which are updated on a daily basis. Having all that information on a single USB stick would require a huge amount of storage, and the time it would take to download all the updates so that they'd be viewable without an internet connection would be insane. This proposal, while entertaining, seems completely unrealistic.

And it would be too expensive for the foundation. The only possibility is to sell an update file each month. --Goldzahn 04:16, 22 September 2009 (UTC)

Positive Arguments

Recent flash memory systems keep getting cheaper and cheaper: 8 GB for €30?! A couple of years ago, I got one with 128 MB for more than €100! The stick should contain only one language, which the user can choose before downloading. A simpler format could also be used, for example by stripping out the images. To answer a question put directly to me: using an Apache base is simple, but it may not even be necessary; the simple HTML version could be used.

Coming out in 2010 - sort of

Linterweb is producing releases of this sort in 2010 - they have already negotiated a major retail distribution. The collections will be of two types:

  1. Simple dumps of the complete set of articles in a particular language, but (I believe) with limited or no graphical content. These will be very complete and quite up to date, but may contain vandalized versions of articles.
  2. en:Version 0.7, which is the top 31,000 articles selected using exactly the criteria you mentioned. It includes all pictures etc, and it has been thoroughly checked for vandalism. The main problem is that the content is now quite old (late 2008). We hope to speed up the process of removing vandalism so that Version 0.8 can be produced much faster. We also need to work out how to do periodic updates. Look for it in a WalMart or WH Smiths near you in the next couple of years! Walkerma 17:26, 5 January 2010 (UTC)

Software strategy for offline changes

I have been thinking about a possible way to do it and below is the result.

--Edu4all 14:42, 10 July 2011 (UTC)

1. Distribution

Create "Wikipedia on a Stick" download release versions.

This could be done as a zipped static HTML version per language, starting with languages that are understood in as many regions with low internet coverage as possible: English, Mandarin, Hindi, Arabic, Russian? A build script would need access to a snapshot of the database, to a live mirror, or at least to some Wikipedia API (any of which is much better than spidering). Huge graphics could be retrieved via HTTP and included only in a smaller version, linking to the high-resolution online versions. If including all articles is too much, articles could be selected by popularity, so articles with very low popularity would be accessible only online. The release cycle could be quarterly, monthly, or maybe weekly, depending on the cost.
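The popularity selection and file layout could be sketched roughly as below. This is only an illustration under assumed data shapes: the view counts, the threshold, and the `articles/` path scheme are hypothetical placeholders, and a real build script would read popularity figures from a database snapshot or a Wikipedia API rather than from an in-memory list.

```javascript
// Sketch: select articles for the offline release by popularity and map
// each title to a static HTML path on the stick. View counts, threshold,
// and the "articles/" layout are hypothetical, not any real dump format.
function titleToPath(title) {
  // Spaces become underscores (as in static dumps); percent-encode the
  // rest so the title is safe as a filename.
  return 'articles/' + encodeURIComponent(title.replace(/ /g, '_')) + '.html';
}

function selectArticles(articles, minViews) {
  // Keep only articles popular enough to ship offline;
  // everything below the threshold stays online-only.
  return articles
    .filter(a => a.monthlyViews >= minViews)
    .map(a => ({ title: a.title, path: titleToPath(a.title) }));
}

// Example with made-up view counts:
const release = selectArticles([
  { title: 'Solar energy', monthlyViews: 120000 },
  { title: 'Obscure topic', monthlyViews: 40 },
], 1000);
// release: [{ title: 'Solar energy', path: 'articles/Solar_energy.html' }]
```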

2. Steps for the user

a) The user can download the zip file and extract it to her USB stick, mobile device, or wherever she prefers.

b) Because it is a static HTML version, there is no need for a server, not even a local one, or for a database. But it includes some JavaScript for the local edits.

c) All external links are kept; it is up to the user to deal with the situation when she is offline.

d) There are the normal "edit" buttons like in the online version of Wikipedia, but it is only possible to edit one section at a time, which minimizes the occurrence of editing conflicts. The edit buttons do the following:

3. Dealing with edits

a) If the user clicks an edit button, the section she wants to edit is copied into a new HTML file (using JavaScript; it could be a "wiki on a stick" derivative or something similar). This file will hold the edits (maybe the edited version plus a diff against the original version). If this section was already edited locally, the edit will start from the last local version (and not start a new edit).
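The local storage of section edits could look roughly like this sketch. All names and the storage layout are made up for illustration; in the browser the store would be a file on the stick or something like localStorage, whereas here a plain object stands in for it.

```javascript
// Sketch: store a local section edit keyed by article and section, keeping
// the base revision id so a later upload can detect conflicts. The store
// shape and field names are hypothetical.
function saveSectionEdit(store, article, section, baseRevId, newText) {
  const key = article + '#' + section;
  const existing = store[key];
  store[key] = {
    article,
    section,
    // If the section was already edited locally, keep the original base
    // revision so the edit continues from the last local version.
    baseRevId: existing ? existing.baseRevId : baseRevId,
    text: newText,
  };
  return store[key];
}

const store = {};
saveSectionEdit(store, 'Solar energy', 2, 12345, 'First local version.');
const edit = saveSectionEdit(store, 'Solar energy', 2, 99999, 'Second local version.');
// edit.baseRevId stays 12345: both local edits share the original base.
```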

b) In LAN environments, every user should be instructed to use her own copy for edits. Or maybe the edit script could handle different users.

c) So the user ends up with a bunch of local "section edit" files on her local storage device. The user will have some HTML/JavaScript admin page for her local edits. There is an "upload to wikipedia" button which will work when the user is online and do the following:
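The admin page's job of gathering all local edits into one upload could be sketched like this. The payload layout is purely illustrative and matches no real Wikipedia API; it only shows that each edit carries its base revision along for the server-side conflict check.

```javascript
// Sketch: collect every saved section edit into a single payload for the
// "upload to wikipedia" button. Store layout and field names are
// hypothetical, not a real Wikipedia/MediaWiki API format.
function buildUploadPayload(store, username) {
  return {
    user: username,
    edits: Object.values(store).map(e => ({
      article: e.article,
      section: e.section,
      baseRevId: e.baseRevId, // needed server-side to detect conflicts
      text: e.text,
    })),
  };
}

// Example store with two locally edited sections:
const localStore = {
  'Solar energy#2': { article: 'Solar energy', section: 2, baseRevId: 12345, text: 'New text.' },
  'Wind power#0':   { article: 'Wind power',   section: 0, baseRevId: 777,   text: 'Intro rewrite.' },
};
const payload = buildUploadPayload(localStore, 'Edu4all');
// payload.edits has one entry per edited section.
```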

4. Upload to wikipedia

a) The "upload to wikipedia" button will upload all local changes to a Wikipedia server, using the user's Wikipedia/Wikimedia account. All subsequent steps will run on the Wikipedia server and no longer from the local JavaScript admin page:

b) All changes that can still be applied (because the base section has not changed in the meantime) will be applied like normal edits by the user. (They will receive some additional information in the version history to distinguish them from online edits.)
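The server-side check described above amounts to comparing base revisions, roughly as in this sketch. The data shapes are again only assumptions for illustration: a real server would look revisions up in its database, not in a plain object.

```javascript
// Sketch: an offline change still applies cleanly if the section's current
// revision equals the base revision recorded when the user went offline;
// otherwise it is queued as a conflict. Data shapes are illustrative only.
function applyOfflineEdits(currentRevisions, edits) {
  const applied = [];
  const conflicts = [];
  for (const edit of edits) {
    const key = edit.article + '#' + edit.section;
    if (currentRevisions[key] === edit.baseRevId) {
      applied.push(edit);    // base unchanged: treat as a normal edit
    } else {
      conflicts.push(edit);  // base changed in the meantime: needs merging
    }
  }
  return { applied, conflicts };
}

const current = { 'Solar energy#2': 12345, 'Wind power#0': 800 };
const result = applyOfflineEdits(current, [
  { article: 'Solar energy', section: 2, baseRevId: 12345, text: 'Applies cleanly.' },
  { article: 'Wind power', section: 0, baseRevId: 777, text: 'Conflicts.' },
]);
// result.applied has one edit, result.conflicts has one edit.
```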

c) Changes that cannot be applied

The changes that cannot be applied, because the base section has changed in the meantime, will be handled as "conflicts". A special conflict-handling strategy is necessary, and this is the core problem of the whole procedure.

5. Conflict handling

One possible simple solution for the start:

a) The offline edit will be inserted into the official online timeline as two edits. The first commit inserts it into the timeline after the base edit; the second reverts it. This comes quite close to reality, and it uses the available structure.

b) Then the two edits and the conflict that occurred should be marked somehow.

c) The user who made the offline edit could be offered the chance to resolve the conflict herself, but other editors of the page could also do that. This is what is commonly known as a "3-way merge".
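The commit-plus-revert idea from a) above can be sketched on a toy history array. Everything here is an assumption for illustration: the history entries, the `offline`/`conflict` flags, and the attribution are invented, not MediaWiki's actual revision model.

```javascript
// Sketch: append the conflicting offline edit to the article history,
// immediately followed by a revert, so the visible text stays unchanged
// but the offline version is preserved for a later 3-way merge.
// Field names and flags are hypothetical.
function recordConflict(history, offlineEdit) {
  const latestText = history[history.length - 1].text;
  // First edit: insert the offline version into the timeline.
  history.push({ author: offlineEdit.author, text: offlineEdit.text, offline: true });
  // Second edit: revert to the previous online text and mark the conflict.
  history.push({
    author: offlineEdit.author,
    text: latestText,
    revertOf: history.length - 1, // index of the offline edit just pushed
    conflict: true,
  });
  return history;
}

const history = [{ author: 'Alice', text: 'Online version.' }];
recordConflict(history, { author: 'Edu4all', text: 'Offline version.' });
// history now has 3 entries and ends with the original online text.
```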