Talk:Task force/Community Health/Survey

From Strategic Planning

Contents

Thread title                                       Replies   Last modified
Pie charts                                         8         08:26, 1 February 2010
open-ended questions - a gold mine of data         2         23:18, 31 January 2010
how many edits did you make in a typical month     0         10:16, 30 January 2010

Pie charts

Edited by another user.
Last edit: 22:57, 30 January 2010

The pie charts are misleading: the percentages do not add up to 100%. Please replace them.

Wasn't logged in: Paradoctor

85.178.210.243, 22:55, 30 January 2010

In most cases the pie charts represent the values (number of people answering) rather than the percentage...

~Philippe (WMF), 22:57, 30 January 2010
 

Yeah, but the numbers don't add up, either! ^_^

Take question 5: it lists 2057 replies and 192.43%.

Paradoctor, 23:07, 30 January 2010

I'm going to redo that one - I suspect it's because people selected more than one answer, but I don't know for sure because I didn't administer the survey. In looking at it, though, I noticed that one of the values was left off, so I deleted the chart until I can re-do it. :)

~Philippe (WMF), 23:10, 30 January 2010
Question   Total answers   Total percent
4          2452            209.36%
6          1069            103.92%
9          1558            145.75%
10         1560            145.93%
12         2071            193.73%

The rest add up to 1069 answers and 100%, respectively.

Paradoctor, 23:35, 30 January 2010

OK, I'm now looking at a copy of the survey. For Q4, Q5, Q9, and Q10, respondents were instructed to choose at most 3 answers; for Q12, they were instructed to select all that were true.

So that explains the discrepancies, I think...

~Philippe (WMF), 02:55, 31 January 2010
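Philippe's point can be checked with quick arithmetic: for a multi-select question, each option's percentage is computed against the number of respondents, so the percentages sum past 100% whenever people pick more than one option. A minimal sketch with made-up counts (not the survey's actual data):

```python
# Hypothetical multi-select question: 1000 respondents, up to 3 picks each.
respondents = 1000
option_counts = {"a": 700, "b": 600, "c": 500}  # invented counts

# Each option's percentage is taken over respondents, not over total picks.
percentages = {opt: 100 * n / respondents for opt, n in option_counts.items()}
total_percent = sum(percentages.values())   # exceeds 100% because picks overlap
total_answers = sum(option_counts.values())  # more answers than people
```

With these numbers the percentages sum to 180%, mirroring how Q4, Q9, Q10, and Q12 exceed 100% in the table above.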
 

Yeah, for those multi-answer questions, a simple bar graph would be more appropriate.

Randomran, 15:14, 31 January 2010

Indeed, that's what I did on the re-upload of Q5.

~Philippe (WMF), 16:25, 31 January 2010
 

Works for me, but I think you can get a nicer, more compact display if you simply use an additional column for the table, displaying horizontal bars there.

Paradoctor, 08:25, 1 February 2010

open-ended questions - a gold mine of data

The survey was designed with several open-ended questions that permitted editors to express themselves freely. This will be hard to quantify. But I suggest two things:

  • We have some person or group go through the actual surveys, and figure out a way to get a representative description of what people were actually talking about.
  • We use some kind of information parsing tool like this one to look for commonly used words. (Or phraselets, ideally.)

It will be important to parse the information first, though. We don't want to just jump in and read all the surveys. We want to look at editors who gave specific answers, to understand what those answers meant, and at editors who had a specific number of edits, to see how new users and experienced users differed.

Randomran, 15:39, 31 January 2010
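The word-frequency idea above can be sketched with Python's standard library; the responses and stopword list here are invented for illustration, not actual survey answers:

```python
from collections import Counter
import re

# Made-up free-text responses standing in for the open-ended answers.
responses = [
    "Editing the markup is too complex for new users",
    "Complex templates and markup scared me off",
    "I left because of edit warring, not complexity",
]

# Tokenize, lowercase, and drop a few very common words.
stopwords = {"the", "is", "of", "and", "for", "me", "i", "not", "too"}
words = Counter(
    w for r in responses
    for w in re.findall(r"[a-z']+", r.lower())
    if w not in stopwords
)
print(words.most_common(3))
```

Counting phraselets (bigrams or trigrams) instead of single words would only need a sliding window over the same token stream.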

Yep, we're gathering those as well. We have to be careful how we use them, because we promised they would be shared only in aggregate; figuring out how to handle that will be important in order to keep our commitment to the respondents.

~Philippe (WMF), 16:24, 31 January 2010
 

That's not too hard. Just pick a criterion (say, all the editors with more than 1000 edits, or all the editors who thought that complexity was a factor in why they left) and draw a random sample of 20 surveys. From those 20 surveys, aggregate their answers about their best/worst experience and the "miscellaneous question" at the end. From there, we could easily look for common themes, as well as significant differences.

Randomran, 23:18, 31 January 2010
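The sampling procedure described above could be sketched like this; the records and the edit-count criterion are invented for illustration:

```python
import random

random.seed(0)  # make the illustration reproducible

# Hypothetical survey records: (edit count, free-text answer). Not real data.
surveys = [(random.randint(0, 5000), f"response {i}") for i in range(500)]

# Fixed criterion: editors with more than 1000 edits.
eligible = [s for s in surveys if s[0] > 1000]

# Random sample of up to 20 surveys to read and aggregate by hand.
sample = random.sample(eligible, min(20, len(eligible)))
```

Because only the sample's aggregated themes would be published, this also fits the commitment to share open-ended answers in aggregate only.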
 

how many edits did you make in a typical month

Some time ago I looked into the User creation log [1], and the numbers there are similar to the ones this survey tells us: only a small percentage of all logged-in users made more than a few edits. The exact numbers are not in this survey, but most new users don't edit at all. The User creation log tells even more about our editors if you look at the edits. One of the things I looked into was how much time passed between the creation of a new account and that user's first edit. You will find that about the same number of editors start editing within 5 minutes as start editing after an hour, and about half of them start editing after 24 hours. New editors spend a lot of time at Wikipedia before editing. Maybe the User creation log could be used to test whether the new Vector skin reduces the time new users spend before editing.

Goldzahn, 10:16, 30 January 2010