Interviews/Trevor Parscal

From Strategic Planning

Digging into your role and background . . .

I started out as a consultant 10 years ago, working and learning under a usability design expert. I also branched out and did other things like political campaigns, end-to-end technology, etc. I ended up with a design and development background and was fortunate enough to find this opportunity. My work at Wikimedia is very similar development-wise to my other experiences (e.g. writing code), but my design background is under-utilized.

When I started we had no designer at all. People noticed I could do that sort of thing, so over time I have been doing it more and more.

I was hired as a core developer in October 2008, and in February 2009 I was moved to the usability project. I’m the only permanent employee on that project – everyone else was hired only until the end of the project. I am a software developer on that team, but I also do a lot of design work, picking up where Parul leaves off (she's a designer too, but she's very research-focused). The team underwent some changes because we struggled to find an effective senior developer, and we eventually decided not to have one at all. That’s actually a theme in this organization – it’s really hard to find high-level people for tech positions because a lot of what we do is so different and particular.

What makes that so difficult?

The problem is mostly specific to MediaWiki. Like a lot of open source projects, in the beginning everyone was basically contributing in small increments in their spare time, and no one was really thinking about the overall architecture – in many ways that's still true. So we have ended up with lots of bad designs, inconsistencies and dirty hacks. The first people involved in developing MediaWiki were basically amateur developers. They were making decisions, some good, some bad. Many of the bad ones eventually created problems, most of which still exist today. If more experienced developers had been hired up front, they wouldn’t have made most of those bad decisions to start with. The only way to fix this situation is to go through and do serious code cleaning and architectural renovation. Nobody really wants to fund that type of work because it isn’t “sexy”. It’s a hard sell, but long overdue.

If we had a clean and easy-to-use code base, it would be easier to attract developers. We were given $890K, in the form of a grant, to improve the user interface, especially for editing. Probably one-third of our development time went into fixing parts of MediaWiki to allow us to move forward, or working around the many difficulties that MediaWiki seems to arbitrarily present. We could have potentially completed a third more features in the same amount of time.

There needs to be more attention to seriously cleaning things up. But it’s very hard to find people who are qualified to do that and also willing to accept the under-market compensation we currently offer. It seems like we interview and attempt to hire a lot of people, but probably four out of five end up declining the offer because of inadequate compensation. The result is that there are two types of people who end up accepting jobs here – those who are really passionate about open source and have arranged their lifestyle in a way that affords them the ability to take the job, and people who are actually only worth that much.

While, yes, the learning curve is quite steep, there are lots of brilliant people out there who could quickly be very productive here, but we don’t pay enough to attract or retain them.

Realistically, the level of burnout is going to continue to be a problem. I know we are a non-profit, but when I first started here, I often heard “We are unlike any other non-profit. We are different.” I haven’t heard that in about a year. The culture seems to have shifted quickly towards being just like other non-profits.

If the code is simpler, the contributions will be easier for staff members to review and the ratio of volunteers to staff members could be higher. There are actually very few volunteers who are contributing to the core code on a regular basis. Most are working on extensions that are relevant to their own personal needs. The number of volunteers is shrinking because many people get too busy and leave, while many of those remaining are unfriendly to newcomers.

We have developers in our community who are very helpful and write great code, but not all of them have great social skills and some of them are territorial. It’s also very much like editing Wikipedia, where more established editors have a “you have to play by the rules” mentality, but the number of rules has grown substantially since they themselves made their first edit.

What could change that dynamic?

First off, meet-ups would be a good start. We did have a conference in Berlin just for developers. I went, and met all these developers who had written code that hadn’t been integrated yet because they were talking to the wrong people. We should probably have those like twice a year.

Additionally, better communication and being more inviting would help tremendously. We don’t have anyone on staff paying attention to the website for our software. Other projects' websites suck you in and make it really easy to get started. And it works – they have a huge base of developers.

We have a very broken process for getting people access to contribute code. We have a hodge-podge of tools that aren’t linked together very well, or at all in some cases. There's a general failure to see that volunteers are like customers and that we need to value them and market to them to make it easy for them to get involved.

We did just hire a code maintenance engineer who is trying to act as a buffer for the people writing code and to help do code review, but we probably should have hired a CTO first and given them the ability to hire for this position and also provide guidance. Community management has been totally overlooked.

Usability initiative

Basically MediaWiki provides a horrible user experience – it's ugly and difficult to use. The projects we run on MediaWiki seem to only attract and retain people who are willing to endure pain to learn the tools. Often, however, they then claim seniority/importance/superiority because they have become "experts" – but they are really just the only ones willing to put up with such horrible tools.

Also, open source communities suck at design work.

Successes – we’ve been able to staff up, despite a lot of challenges, in time to save the project. We put out a skin called Vector and a couple of editing tweaks, and about 85% of people who opted in have stayed. Liz Stanton seems to be thrilled with our use of her resources.

Failures – the fact that we didn’t do this earlier and looked to a grant to do it. This work is so crucial to Wikipedia moving forward that I feel it's ridiculous that we both waited so long to do it and also didn’t think it was important enough to allocate internal resources for it. Grants come with strings attached and lots of time wasted on reporting, etc. The grant says what features we have to create, regardless of what the usability surveys indicate we should do.

Features we would have chosen – making the edit and view pages the same thing, essentially in-place WYSIWYG editing. The problem is that takes a couple of years to do, and the grant we are working through only gives us one year and not enough people to pull that off. So we have created something that, while better than the current user experience, isn’t what the user really needs.

The grant uses such specific language that it locks us into doing specific things that are resource-intensive instead of focusing on low-hanging fruit.

What is most important for us to have?

Long term investments – WYSIWYG editing is the best long-term investment we could make. The other is real-time collaboration: being able to see everyone who is editing at the same time. This technology exists, and we should be doing it (e.g. Google Wave). For real-time collaboration we would need more powerful servers and a lot more of them. Real-time collaboration would bring so much into view – but getting over the barrier to participation would still require learning wikitext. So WYSIWYG is probably more likely to increase the number of valuable edits by presenting a lower barrier for new editors.

How long would it take the team you have to develop a WYSIWYG editor?

We’ve been deliberately taking steps to facilitate that in the future. After we finish what we are doing, it would probably take another two years. We could do it in half the time if we doubled capacity and got the people we need.

Low-hanging fruit – Our search results are displayed much better now, but the actual results are still horrible. Google does a much better job of getting you where you want to be in as few clicks as possible. All we have to do is improve our search engine. How many search engines exist out there that are better than what we have? I think there are many. We're talking about common technology here, not flying cars.

The quality of our search results is hindered by the fact that most content is not very well categorized, but probably more significantly by the fact that we aren't using a very good search engine. To make matters worse, there are features that we don’t have enough money to turn on because we don’t have the bandwidth, processing power or memory.

We have a data center in Florida and caching center in Amsterdam. We could have way more caching centers for free through donations if we could come up with a way to set them up and keep them running remotely, and let them come on and offline freely without causing any problems. But even more importantly, if Florida goes down, we are down.

We need redundancy, and each server cluster needs to be much more capable. Advanced features make caching more difficult because they provide users with a more customized experience.

In short, if participation were to increase by some significant factor, our servers would go down.

Social networking features

Volunteer developers could, and often do, write extensions that add these kinds of features. The problem is that most of this kind of communication involves a high volume of unique requests to our servers, and we simply don’t have the capacity for that. Performance would be a huge barrier. If we did add these kinds of features, we would need a lot of involvement from the Foundation to get around the operations constraints. In-browser chat, for instance, is really resource-intensive to run. This is true of nearly all web 2.0 type technologies.

Do we have to run all of our own data centers? Twitter doesn’t. I sense a strong attitude that we need to, but we could probably be running at least some of our services on something cheaper and more reliable, and possibly free in the case of a donation-in-kind. However, to my knowledge, that conversation doesn’t seem to be taking place.