It’s a bit of a shame we don’t have the tools at hand to fork (esp. partially) and merge text collections as easily as we do source code. I guess we need to invent them?
So the Wikimedia vision is big. Bigger than any of us. Bigger than the Wikipedia and WMF logos. Are we ready for that?
I guess I have been set thinking because of the strategic planning and the not-news that Wikipedia growth is slowing. Wikipedia content is going to outlive us all. It will be the new breed of internet cockroach. Good; that’s the way we designed it. But how will le Googs cope? How will people (searchers) be able to distinguish between potentially dozens of near-identical copies? Maybe those tools need to be created too.
I am thinking about how Linux has (eventually) flourished, and I think it is in no small part due to Linus Torvalds’ utterly hands-off leadership. I am thinking about how anyone who wants to can call themselves a Linux user group, or create a Linux conference.
Maybe it is overly simplistic to say this, but the fork is not as feared as it used to be – because now we have much better tools for its flip-side, the merge. Is forking painful but necessary for an ultimately better result?
I am also thinking about Wikimedia chapters, and who gets to use the Wikimedia name, and why and when this is important.
And I am thinking about my favourite shiny tech thing, the write API, and wondering if ‘we’ are ready to uncouple the text collections from MediaWiki, let alone Wikimedia. Because ‘we’ own it… don’t we? Well, we don’t own it, but we’re guarding it with our lives. Wait a minute…
What about uncoupling it from MediaWiki but not Wikimedia? An interface fork, like.
Back to the mission. If all the ‘doing’ ends up uncoupled from WMF, is that okay? If ten other websites fulfil WMF’s mission without WMF having to pay the hosting bills, is that to be considered an ultimate success or failure? Can an organisation be truly invested in a movement, rather than (potentially one day instead of) its own continued existence?
Are we as a community too invested in our own continued existence? …Has it ever been any other way?
I like your ideas :)
The only problem we’ve got is that we’re too tightly bound to what MediaWiki wants us to do. The database is pretty much useless outside of a MediaWiki installation, and the dumps are too massive to be of any practical use.
What we really need is a custom wiki:// protocol that allows for easy sharing, forking and mirroring of content. People can fork their own versions of the content, and then perhaps request merges back into the master copy.
I wonder how MediaWiki would be different if Git had come before SVN.
— Chad · 12. August 2009, 00:37
Thanks Chad :)
I always thought that for Commons it would be useful (and probably quite easy) to have an extension that let you download a ZIP of a category, i.e. everything in the category. This would go some way towards ‘partial dumps’ or ‘partial forks’. But I don’t know whether it would be too bandwidth-intensive to ever get off the ground on Wikimedia.
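As a rough sketch of what that might look like from the outside: the existing MediaWiki API can already list every file in a category, which is the hard part of such an extension. The helper names below, and the use of the Commons endpoint, are just illustrative assumptions, not an actual extension:

```python
import json
import urllib.parse

# Assumed endpoint; any MediaWiki installation exposes the same api.php.
COMMONS_API = "https://commons.wikimedia.org/w/api.php"

def members_url(category, api=COMMONS_API):
    """Build the API query URL listing every file in a category."""
    params = {
        "action": "query",
        "list": "categorymembers",
        "cmtitle": "Category:" + category,
        "cmtype": "file",   # files only, not subcategories
        "cmlimit": "500",
        "format": "json",
    }
    return api + "?" + urllib.parse.urlencode(params)

def file_titles(raw_json):
    """Pull the file titles out of the API's JSON response."""
    data = json.loads(raw_json)
    return [m["title"] for m in data["query"]["categorymembers"]]
```

Fetching that URL, then each file’s actual location via `prop=imageinfo&iiprop=url`, and writing the lot into a ZIP would be more or less the whole extension — though it does nothing to answer the bandwidth question.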
— pfctdayelise · 12. August 2009, 12:05
How about integrating articles with Google Wave? I vaguely remember something about robot functionality in GW that could allow the MediaWiki API to be accessed from within the wave… That would uncouple the editing process from MediaWiki, at least.
— Borofkin · 13. August 2009, 09:05