Proposal talk:Get another CMS for Commons


Impact?

Some proposals will have massive impact on end-users, including non-editors. Some will have minimal impact. What will be the impact of this proposal on our end-users? -- Philippe 00:00, 13 September 2009 (UTC)

The main impact would be improved usability for Commons users. Searching for images and locating similar images would become easier. Image URLs should ideally stay as they are now, or at least have redirects from the old URL to the new one, so that existing links will not break. Regards, --ChrisiPK 21:02, 13 September 2009 (UTC)
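As a side note on keeping existing links alive, here is a minimal sketch of the redirect idea, assuming a Node.js front end; the old and new URL patterns are placeholders for illustration, not Commons' actual scheme.

    // Minimal Node.js sketch: preserve old file URLs by issuing permanent
    // redirects to their new location. The path pattern and target scheme
    // below are hypothetical placeholders.
    const http = require('http');

    http.createServer((req, res) => {
      const match = req.url.match(/^\/wiki\/File:(.+)$/);
      if (match) {
        // 301 tells browsers and search engines the move is permanent,
        // so existing external links keep working.
        res.writeHead(301, { Location: '/media/' + match[1] });
        res.end();
      } else {
        res.writeHead(404);
        res.end('Not found');
      }
    }).listen(8080);
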

Don't like

I often cut and paste content between other Wikimedia projects and Commons (particularly various types of links, which I don't believe any other CMS is going to support), and I'd be pretty concerned about compatibility on that front. Not to mention the more basic question of whether some non-MediaWiki software could mesh with all of the MediaWiki sites and properly serve content to them.

Also, as a web software engineer, I don't agree with the statement that web applications should not be JavaScript-based, nor can I even figure out where you're coming from with it. It completely contradicts all of the trends I see; take Gmail or most other Google apps, for example. The Ajax-based pre-upload preview on Commons is a rather late arrival of a heavily JS-based feature.

But even besides all that, I don't see why you think that the functionality of Commons is primarily JS-based; it seems to me that it probably contains far less JS code than the other sites mentioned, like Flickr or Picasa. --Struthious Bandersnatch 05:08, 21 September 2009 (UTC)

I'm not saying that we should not use JS at all; I'm saying that we should use it only for the things that really need to be done on the client side. Right now JS is the only way people can customize a MediaWiki installation, which is why loads of JS are served where the server should be doing the job. Take the upload form, for example: there is a really complicated JS script transforming the upload form and checking what you input. This could be done far better by simply changing the HTML code of the upload form and adjusting it to our needs. That is, however, not possible with the MediaWiki software, which is why we need specialized software adjusted to the particular needs of image hosting. I don't oppose using JS in such software, as long as it is really used for tasks that need to be done on the client side. Regards, --ChrisiPK 18:59, 21 September 2009 (UTC)
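For illustration, this is roughly the kind of post-load form rewriting being described; the element IDs and the injected license field below are assumptions for the sketch, not the actual Commons code. A purpose-built server would simply emit the finished form HTML instead of reshaping it in the browser.

    // Hypothetical sketch of post-load form rewriting: the server ships a
    // generic upload form and JavaScript reshapes it in the browser.
    window.addEventListener('DOMContentLoaded', () => {
      const form = document.getElementById('mw-upload-form'); // assumed ID
      if (!form) return;

      // Inject an extra license selector that the generic form lacks.
      const select = document.createElement('select');
      select.name = 'wpLicense';
      ['', 'CC-BY-SA-3.0', 'GFDL'].forEach(value => {
        const option = document.createElement('option');
        option.value = value;
        option.textContent = value || 'Choose a license...';
        select.appendChild(option);
      });
      form.appendChild(select);

      // Client-side check only: trivially bypassed with JavaScript disabled.
      form.addEventListener('submit', event => {
        if (!select.value) {
          alert('Please select a license.');
          event.preventDefault();
        }
      });
    });
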
About MediaWiki compatibility: if I remember correctly, MediaWiki uses the API to get content from the shared repository. It should not be a problem to implement this interface and output content in MediaWiki syntax. It might even be worth considering whether we should use MediaWiki syntax in this new CMS, so copy and paste would generally work. Regards, --ChrisiPK 19:01, 21 September 2009 (UTC)
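For reference, a minimal sketch of the kind of api.php query other wikis use to pull file metadata from the shared repository; any replacement CMS would have to answer an equivalent request. The file title here is just an example.

    // Minimal sketch: query the MediaWiki API for a file's URL and metadata.
    const endpoint = 'https://commons.wikimedia.org/w/api.php';
    const params = new URLSearchParams({
      action: 'query',
      titles: 'File:Example.jpg',
      prop: 'imageinfo',
      iiprop: 'url|size|mime',
      format: 'json',
      origin: '*'              // allows anonymous cross-origin requests
    });

    fetch(`${endpoint}?${params}`)
      .then(response => response.json())
      .then(data => {
        const pages = data.query.pages;
        for (const id in pages) {
          console.log(pages[id].imageinfo); // URL, dimensions, MIME type
        }
      });
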
I am not convinced that client-side scripting like JavaScript is something that should be avoided and replaced by code running on a server. Is it not the case that client-side code reduces the load on the server, which then does not need to do that work? The claim that Commons serves more JavaScript than images is very strange, as the scripts are largely the same when moving from image to image and should be cached by the browser. If that is really so, the scripts are likely being downloaded afresh every time, which may indicate badly set caching-related HTTP headers, or a problem with how scripts are placed in general. Or maybe the statistics do not differentiate between HEAD and GET requests? Audriusa 09:07, 26 September 2009 (UTC)
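One quick way to check this suspicion is to look at the cache-related headers a script is actually served with; a minimal sketch, with a placeholder script URL (run it from the same origin, e.g. the browser console on Commons, to avoid cross-origin restrictions):

    // Minimal sketch: inspect the cache-related headers of a script resource.
    // If Cache-Control / Expires / ETag are missing or very short-lived,
    // browsers will re-download the script far more often than necessary.
    const scriptUrl = 'https://commons.wikimedia.org/example-script.js'; // placeholder

    fetch(scriptUrl, { method: 'HEAD' })
      .then(response => {
        ['cache-control', 'expires', 'etag', 'last-modified'].forEach(name => {
          console.log(`${name}: ${response.headers.get(name)}`);
        });
      });
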
The entire idea of a server is that the server does the parsing and the client does the rendering. If we wanted clients to do the actual work, we could rely much more on JavaScript; that would reduce the load on the server even further. The problem is that JavaScript was not designed to do the major part of website creation. This is why JavaScript should be used for tasks which need to be done on the client side, and only for those. I am currently browsing Commons on a Pentium 4 machine, and I assume that many of our users have similar setups, as not everyone has a high-end machine in the office or at home. Waiting for JavaScript to finish is very tedious, as you can do basically nothing until that has happened; some scripts take several seconds to finish executing on my machine. This is, of course, sometimes also due to poorly coded scripts, but we wouldn't need people messing with JavaScript if we did the parsing on the server. Multilanguage support is one of the best examples: right now, the server sends translations for all languages and then a JavaScript routine iterates over them and removes the ones that are not the one we want to see. This is just inefficient; we could have the server select the language and send only that to the user. If the user wants to view a different language, we could still use AJAX to load it. Another example is the upload form: all validation of the upload form input on Commons is done via JavaScript, so if you want to upload a file without a license template, just disable JavaScript in your browser. If we did the validation on the server side, it would not be so easy to circumvent. Regards, --ChrisiPK 15:56, 26 September 2009 (UTC)
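To make the multilanguage point concrete, a minimal sketch of loading only the selected language on demand; the endpoint, parameters, and element IDs are assumptions for illustration, not an existing Commons interface.

    // Hypothetical sketch: instead of shipping every translation and hiding
    // all but one in the browser, the server sends only the selected language,
    // and other languages are fetched on demand.
    function loadDescription(fileTitle, languageCode) {
      const url = `/description?file=${encodeURIComponent(fileTitle)}` +
                  `&lang=${encodeURIComponent(languageCode)}`;     // assumed endpoint
      return fetch(url)
        .then(response => response.text())
        .then(html => {
          document.getElementById('file-description').innerHTML = html; // assumed ID
        });
    }

    // Only runs when the user explicitly asks for another language.
    document.getElementById('language-select').addEventListener('change', event => {
      loadDescription('Example.jpg', event.target.value);
    });
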
What you are saying simply is not true. You're describing mainframes and terminals, not modern computing. SSJS (Server-Side JavaScript) is one of the oldest server-side technologies, and JavaScript is embedded in myriad different systems for many different purposes through things like the Mozilla Foundation's "Rhino" component. SSJS is going through a revival with things like Aptana's Jaxer server, and of course Flash's ActionScript is based on JS... JavaScript is by no means a single-purpose language, nor is it "not designed" for doing the things you describe.
The bit you describe about languages is a poor design and probably poor coding too. It ought to be using HTTP language negotiation, actually. Converting to another CMS would be a massive amount of work and it's entirely possible that the integration code and customizations made to any new CMS might be just as crappy.
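For completeness, "HTTP language negotiation" means honouring the Accept-Language request header on the server; a minimal Node.js sketch (quality values are ignored for brevity, and the available languages are placeholders):

    // Minimal sketch of language negotiation: pick the best match between
    // the browser's Accept-Language header and the translations we have.
    const http = require('http');
    const available = ['en', 'de', 'fr']; // translations we can serve

    http.createServer((req, res) => {
      const header = req.headers['accept-language'] || 'en';
      const preferred = header
        .split(',')
        .map(part => part.split(';')[0].trim().slice(0, 2).toLowerCase());
      const lang = preferred.find(code => available.includes(code)) || 'en';

      res.writeHead(200, { 'Content-Type': 'text/plain', 'Content-Language': lang });
      res.end(`Serving the ${lang} description.`);
    }).listen(8080);
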
And "there's an API" is a very off-the-cuff answer to wikitext compatibility issues. Mediawiki has a great many capabilities that aren't even present in other wiki packages, much less in any generic CMS I know of (and I actually specialize in CMSes.) Just take the system of categories, for example - many CMSes can't even do nested categories, much less do all of the caching that Mediawiki does. I would not want to replace a simple cut and paste of the categories list from the matching Wikipedia article for an image with having to click through some UI to add each category to an image.
There just isn't any net benefit to what you're proposing, and it comes with a massive cost (or, if you think it would be easy, just go ahead and set up a proof of concept). --Struthious Bandersnatch 08:02, 5 October 2009 (UTC)
I do agree that JavaScript can be used for a great number of things and purposes, but when a website relies heavily on JavaScript to create the initial display of a page, that is just wrong and a waste of computing power on the client side. Yes, JavaScript is great for dynamic things, but it is certainly not the right idea to create a page on the server only to then add many parts on the client side using resource-intensive JavaScript. I don't quite see what that has to do with server-side JavaScript.
I agree that there might be wikitext compatibility issues; the API was just the answer to your concern that this would not interface well with MediaWiki setups. I do think that implementing the basic syntax (bold, italics, linking, etc.) should not be a problem; what else do you need to copy? Categories can be converted to tags, and template calls cannot be copied anyway, as that would require copying the entire template.
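As a rough illustration of how little the basic syntax covers, a naive converter for bold, italics, and internal links; real wikitext has many more rules, so this is only a sketch:

    // Naive sketch: convert the basic wikitext constructs mentioned above
    // (bold, italic, internal links) to HTML. Bold must be handled before
    // italic because ''' contains ''.
    function basicWikitextToHtml(text) {
      return text
        .replace(/'''(.+?)'''/g, '<b>$1</b>')                        // '''bold'''
        .replace(/''(.+?)''/g, '<i>$1</i>')                          // ''italic''
        .replace(/\[\[([^\]|]+)\|([^\]]+)\]\]/g,                     // [[Target|label]]
                 '<a href="/wiki/$1">$2</a>')
        .replace(/\[\[([^\]|]+)\]\]/g, '<a href="/wiki/$1">$1</a>'); // [[Target]]
    }

    console.log(basicWikitextToHtml(
      "''Falco peregrinus'' is a '''falcon''', see [[Peregrine falcon|its article]]."
    ));
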
I am not hell-bent on getting a new CMS; I just want the functionality I mentioned implemented in a client-friendly way. Right now Commons is driving my machine to its limits; tabbed browsing is impossible, as some scripts take several seconds (up to 10) to finish. That completely locks up the machine and makes working effectively impossible. I would be just as happy if those features were integrated into MediaWiki, but I assume this is not the way the MediaWiki devs want to go. MediaWiki is good at collaboration on text content; images are more of a gadget than a feature. The image functionality would need a massive rewrite, and I am not sure whether that would take less time than creating this system from scratch, or whether the MediaWiki community would want to see those features in their product. Regards, --ChrisiPK 16:21, 6 October 2009 (UTC)

Yes!

It's high time we rid Commons of MediaWiki. Maybe a fork of MW specifically designed for use on Commons will do the trick. -- 77.1.58.224 08:31, 16 October 2009 (UTC)

Implementation issues

The issues raised could be addressed with

instead of a different CMS. One could of course argue that other MediaWiki extensions could supply the same functionality. --Fasten 11:32, 29 October 2009 (UTC)

Of course. The question is whether that is the way the MW developers want their product to go. I'm not opposed to adding this functionality; I just don't think that this is MediaWiki's focus. Regards, --ChrisiPK 15:09, 4 November 2009 (UTC)
It could all be implemented with MediaWiki extensions; I don't think the mw:MediaWiki roadmap would be the problem. The Wikimedia Foundation also appears to have a sufficient number of hired programmers to make its own development plans. --Fasten 18:13, 4 November 2009 (UTC)

More?

I think this is a very good proposal, but there is more that I would want from new software. I don't want to ruin a featured proposal, so I thought I'd ask here first: can I add other things that I'd like to see in possible new software? --The Evil IP address 21:23, 9 December 2009 (UTC)

You can post your ideas on the talk page. If you make additions to the proposal, I recommend a third-level section named "Additions" under the proposal section. That makes it easier for readers to distinguish additional ideas from the original proposal. --Fasten 12:55, 12 December 2009 (UTC)

Free CMS

Hello, I know that the programmer of Ajax-Browser is pro-GNU and is willing to release his software's code. As a former Commons administrator, I think this tool may be interesting for file management and so on (test it on the demo page and see the features list), and its code is 'easily' editable. If a Wikimedian programmer wants to test the Pro version of Ajax-Browser for free and wants to modify it to fit Commons' needs, send him an email stating that you are from Wikipedia. He will be happy to send you and the Foundation a pro version and thus help back! :] --Yug 02:59, 1 July 2010 (UTC)