Proposal:Sponsor development of automated and assisted tools for combating vandalism

From Strategic Planning
Status

The status of this proposal is:
Request for Discussion / Sign-Ups


Summary

Existing tools for combating vandalism on Wikimedia projects (and indeed on any project running MediaWiki software) have been developed haphazardly, and they are struggling to cope with the scale of the largest projects.


Proposal

The foundation should take a more active role in the development of tools and infrastructure for detecting and combating vandalism on Wikimedia projects, with a secondary goal of more comprehensive examination of recent changes.

To be effective, such tools would need to:

  • Divide recent changes into manageable portions so the workload can be shared among patrollers.
  • Filter out edits made by trusted users or meeting other configurable criteria.
  • Record the extent to which each change has been reviewed and the reviewer's confidence, so that no edit escapes at least some scrutiny.
  • Report on articles with uninspected or weakly examined changes, with the design goal of bringing each article to a clean, fully inspected state and keeping it there.
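As a rough illustration only, the review model described in the list above could be sketched as follows. This is a hypothetical Python sketch, not part of any existing tool: every name in it (`Edit`, `ReviewLog`, `shard_for`, `needs_review`, `TRUSTED_USERS`, and the 0.5 confidence threshold) is an assumption chosen for the example.

```python
from dataclasses import dataclass, field
from hashlib import sha256

# Assumption: a configurable list of trusted users whose edits are auto-accepted.
TRUSTED_USERS = {"ExampleAdmin"}

@dataclass
class Edit:
    rev_id: int
    article: str
    user: str

@dataclass
class ReviewLog:
    # rev_id -> highest reviewer confidence recorded so far (0.0 to 1.0)
    confidence: dict = field(default_factory=dict)

    def record(self, rev_id: int, conf: float) -> None:
        """Record a review, keeping the strongest confidence seen for an edit."""
        self.confidence[rev_id] = max(self.confidence.get(rev_id, 0.0), conf)

    def unreviewed(self, edits, threshold=0.5):
        """Articles with at least one edit below the confidence threshold."""
        return sorted({e.article for e in edits
                       if self.confidence.get(e.rev_id, 0.0) < threshold})

def shard_for(edit: Edit, n_patrollers: int) -> int:
    """Deterministically divide recent changes among patrollers by article,
    so two patrollers never duplicate work on the same article."""
    digest = sha256(edit.article.encode("utf-8")).hexdigest()
    return int(digest, 16) % n_patrollers

def needs_review(edit: Edit) -> bool:
    """Filter: edits by trusted users are accepted without manual review."""
    return edit.user not in TRUSTED_USERS
```

Sharding by article (rather than handing everyone the same live feed) is one possible way to divide the workload; recording a per-edit confidence is what would let a report distinguish "glanced at" from "fully inspected".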


Motivation

  • The Recent changes page on a very large project, such as the English Wikipedia, has become effectively unworkable: anywhere from 80 to 300 edits normally occur per minute. What was once (and on small wikis still is) an effective quality-control tool is now nearly useless - there is simply too much happening for one person to follow, and there is no means to divide the workload other than controversial measures such as Pending changes.
  • Most tools for combating vandalism and monitoring article quality rely on freshness and/or watchlisting - if an unwanted change is not caught at the time it is made, it might be caught within the next week by someone who has the article watchlisted; otherwise it can persist far longer.
  • Buried vandalism has been used in several high-profile cases to damage the credibility of Wikimedia projects, particularly the English encyclopedia.
  • RC patrollers using tools such as Huggle are rushed, and may be more apt to make mistakes - both because of a competitive culture fostered by the tools' design and because of the pressure of a diff queue that fills rapidly. Without heavy use of filters, it is impossible to examine changes fast enough to keep up with the live RC feed.

Key Questions

  • How will backlogs be prevented?
  • Who gets access to these tools?
  • How do we make these tools collaborative rather than competitive?
  • How do we maximize coverage of changes?
  • At least on the English Wikipedia, the vandal-fighting bots have been improving faster than the vandals can adapt. How would a change of strategy improve on what the existing processes are already achieving?

Potential Costs

  • Developer time at least comparable to that required for a tool such as WP:HUGGLE
  • Infrastructure costs for underlying collaboration platform

References

Community Discussion

Do you have a thought about this proposal? A suggestion? Discuss this proposal by going to Proposal talk:Sponsor development of automated and assisted tools for combating vandalism.

Want to work on this proposal?

  1. Sign your name here!