"Community Health" measures
Dear Dr. Simple is Good,
Absolutely. But in the meantime.... MOAR DATA PLZ!
Philippe actually nailed it, in my book. I don't think we've quite gotten to the bottom of what *really* affects community health yet, so we need to track a lot of different numbers.
Satisfaction is probably more accurate than happiness. But the survey showed that some of the most dissatisfied editors are actually our most engaged. Not sure why that is. Maybe it's pathological OCD. Maybe more engaged editors are just willing to put up with higher levels of BS. But the real point: we're not going to be able to measure health just by checking the population for satisfaction and growth. Dissatisfaction and stagnation are symptoms, not the diagnosis.
The survey focused on newer editors... and it seems there are three killer stats to look at.
- One would be how many editors abandon edits without hitting save. (Obviously some amount of that is normal. But like Sue noticed, it would be a huge sign of improvement to get from 50% abandonment down to 40%, or whatnot.) That would measure how easy and convenient people find it to edit.
- Two would be how many edits get reverted. Again, some amount is normal. But a high revert rate points to a problem on one of two ends: either a community that is hostile to change, or an influx of editors making inappropriate changes that cut against community norms. Whose fault is it -- the reverter or the revertee? It almost doesn't matter. As natural as reverting is, we know that too much of it is a bad sign.
- Three would be activity at dispute resolution pages. Drama usually ends up there. Drama will probably grow with the population (more people means more disputes), but if it's growing faster than the population, then we have a community health problem. (There's a rough sketch of how these three numbers could be rolled up right after this list.)
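To make the three stats concrete, here's a very rough sketch of how they could be computed from monthly counts. None of these field names come from any real Wikimedia schema or API; they're just placeholders for numbers we'd hope to pull out of the logs.

```python
# Hypothetical monthly rollup of the three "killer stats".
# All field names are invented for illustration only.

def health_indicators(month):
    # month is assumed to be a dict of monthly counts, e.g.
    # {"edits_started": ..., "edits_saved": ..., "edits_reverted": ...,
    #  "active_editors": ..., "dispute_threads": ...}
    abandonment_rate = 1 - month["edits_saved"] / month["edits_started"]
    revert_rate = month["edits_reverted"] / month["edits_saved"]
    drama_per_editor = month["dispute_threads"] / month["active_editors"]
    return abandonment_rate, revert_rate, drama_per_editor

# Made-up numbers, just to show the comparison between two months.
jan = {"edits_started": 1000, "edits_saved": 500, "edits_reverted": 40,
       "active_editors": 200, "dispute_threads": 12}
feb = {"edits_started": 1100, "edits_saved": 530, "edits_reverted": 60,
       "active_editors": 210, "dispute_threads": 20}

print(health_indicators(jan))  # (0.5, 0.08, 0.06)
print(health_indicators(feb))  # all three higher than January
```

The third number is the one that handles the "growing faster than the population" question: drama normalized per active editor, so it only rises when disputes outpace the growth of the community itself.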
It would also be REALLY useful if we could slice up the dataset. Imagine that we could find out that we're getting more editors, but only for articles about music! Then we could look at other stats around music articles, and figure out what's helping that music sub-community grow while other parts of Wikipedia are stagnating. Numbers are just data. But when you can compare them to something, you can understand what the heck is really going on.
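And a hand-wavy sketch of what that slicing could look like for one of the stats (revert rate), assuming we could tag each saved edit with a topic. The data layout here is invented; the interesting part is the side-by-side comparison between sub-communities.

```python
# Hypothetical per-topic revert rate. The "topic" and "reverted" fields
# are placeholders; real data would need some way of tagging articles.
from collections import defaultdict

def revert_rate_by_topic(edits):
    saved = defaultdict(int)
    reverted = defaultdict(int)
    for e in edits:
        saved[e["topic"]] += 1
        if e["reverted"]:
            reverted[e["topic"]] += 1
    return {topic: reverted[topic] / saved[topic] for topic in saved}

edits = [
    {"topic": "music", "reverted": False},
    {"topic": "music", "reverted": False},
    {"topic": "history", "reverted": True},
    {"topic": "history", "reverted": False},
]
print(revert_rate_by_topic(edits))  # {'music': 0.0, 'history': 0.5}
```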
I love the idea of segmentation by article type, Randomran. I'm going to continue to think on that a little. Really great idea.