InstantCommons lives -- and why it matters

OK, my NaBloPoMo definitely failed. Nonetheless.

Something that was introduced to MediaWiki with little fanfare was ForeignAPIRepo, a file repository type configured via $wgForeignFileRepos. This allows a MediaWiki install to specify another wiki to pull images and other media files from. Like, say… Wikimedia Commons!

This is an idea that has a long history under the name InstantCommons. One of the main reasons Wikimedia Commons is cool is that all the Wikimedia wikis can use it as a media resource as if its files were uploaded locally. So “InstantCommons” is about extending this aspect of Wikimedia Commons behaviour to any MediaWiki wiki.

I enabled it for the Wikimedia Australia wiki. All I needed to do was paste these settings into the LocalSettings.php file:
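Roughly along these lines (a sketch of a typical ForeignAPIRepo setup rather than a copy of what’s actually on the WMAU wiki; the repo name and cache expiries are just reasonable defaults):

```php
// Pull files and their description pages from the Wikimedia Commons API
// instead of requiring local uploads.
$wgForeignFileRepos[] = array(
    'class'                  => 'ForeignAPIRepo',
    'name'                   => 'commons',
    'apibase'                => 'http://commons.wikimedia.org/w/api.php',
    'fetchDescription'       => true,   // show the Commons description page locally
    'descriptionCacheExpiry' => 43200,  // cache description pages for 12 hours
    'apiThumbCacheExpiry'    => 86400,  // cache thumbnails for 24 hours
);
```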

Now, on the front page, there is a colourful map. Clicking through to the image page shows the full image information from Wikimedia Commons, as well as a subtle hint as to the image’s origin.

I wrote to wikitech-l to ask what the plans are for future development of this functionality. The response was pretty quiet, although Chad is planning to do some dev work on it, which is awesome.

There are two big obvious gaps in the functionality at the moment. The first is that third parties using InstantCommons have no way of knowing what happens to the images they are using. While wiki resources such as Wikipedia and Wikimedia Commons may look stable from the outside, editors know that they are anything but. Especially with images: there is no straightforward way to move/rename them, so images get deleted and re-uploaded under the preferred name. Copyright violations are also rife among uploaded files, much more so than among contributed text, I would guess. Not to mention that the community’s understanding of international copyright is constantly being negotiated and argued over. It’s very tricky; not straightforward at all.

So this is one thing. It is nice to know when images that you use are deleted, so that you can remove the ugly redlinks from your pages. But on your third-party wiki you have no way to find this out locally: deletions at the other location won’t appear in your local logs. There are a few choices:

1. Ignore it, and don’t worry about what happens to the files upstream.
2. Keep an eye on a log or feed of changes affecting the files you use.
3. Run a bot that automatically removes links to files once they have been deleted upstream.

The first option seems good, but depending on what your wiki is and how you run it, you may well want to keep up to date and, for example, remove known copyright violations.

The second is a good option but may not see much uptake. A similar concept was created for Wikimedia wikis, known as CommonsTicker. However, my observation is that CommonsTicker has not been a very popular choice. Wikimedia wikis might get pissed when an image that had been on their front page gets deleted, but that doesn’t mean they’re prepared to wade through the CommonsTicker log on a regular basis.

The third has been the most popular in the Wikimedia universe. CommonsDelinker is a bot that runs over all 800+ Wikimedia wikis and removes redlinks to images after they have been deleted at Wikimedia Commons (if the local community has not removed the link themselves in the meantime). CommonsDelinker comes with a manual (the code is under the MIT license), and with some minor tweaking it might be usable by third parties. It makes more sense for third parties to run the bot themselves than for Wikimedia Commons to run it for them, due to configuration differences.
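For a third party, the core of the job is easy enough to sketch. Something like the following (a hypothetical stand-alone script, not CommonsDelinker’s actual code; the file titles are made up, and it only reports deletions rather than editing any pages) would at least tell you which of the files you embed have gone missing upstream:

```php
<?php
// Hypothetical sketch, not CommonsDelinker itself: given some file titles
// your wiki embeds, ask the Commons API which of them no longer exist there.
$commonsApi = 'http://commons.wikimedia.org/w/api.php';

// Assumption: you already have a list of the foreign files your pages use.
$titles = array( 'Image:Example.jpg', 'Image:Some_deleted_file.jpg' );

$url = $commonsApi . '?' . http_build_query( array(
    'action' => 'query',
    'titles' => implode( '|', $titles ),
    'format' => 'json',
) );
$result = json_decode( file_get_contents( $url ), true );

foreach ( $result['query']['pages'] as $page ) {
    // Pages that don't exist come back flagged as 'missing'.
    if ( isset( $page['missing'] ) ) {
        echo "Deleted upstream: {$page['title']}\n";
    }
}
```

From there, actually delinking is a matter of editing the pages that embed each missing file, or just flagging them for a human to deal with.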

The other gap in the functionality is that there is no way for the central repository to know which of its files are being used, and where. For Wikimedia Commons, we generally like to be able to find out what impact our actions will have, especially if, say, WikiTravel were using our content. Actually, this problem is by no means well solved even within the Wikimedia universe: we rely on a toolserver tool called CheckUsage. If the toolserver goes down, Wikimedia Commons admins have no way of knowing what impact their deletions may have. This still needs a MediaWiki-native solution.
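The lookup itself is not the hard part; knowing who the clients are is. As a sketch (the file title and the list of client wikis here are hypothetical, and maintaining that list is exactly the piece that doesn’t exist yet), each consumer wiki’s own API can already answer “which of your pages embed this file?”:

```php
<?php
// Hypothetical sketch of a CheckUsage-style lookup: ask each known client
// wiki which of its pages embed a given file. The list of client wikis is
// the information a MediaWiki-native solution would have to maintain.
$fileTitle   = 'Image:Some_commons_file.jpg';   // hypothetical example
$clientWikis = array(                           // hypothetical consumers
    'http://example.org/w/api.php',
);

foreach ( $clientWikis as $api ) {
    $url = $api . '?' . http_build_query( array(
        'action'  => 'query',
        'list'    => 'imageusage',   // pages that embed the given file
        'iutitle' => $fileTitle,
        'iulimit' => 50,
        'format'  => 'json',
    ) );
    $result = json_decode( file_get_contents( $url ), true );
    $usages = isset( $result['query']['imageusage'] )
        ? $result['query']['imageusage'] : array();

    foreach ( $usages as $page ) {
        echo "$api: used on {$page['title']}\n";
    }
}
```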

Why is InstantCommons important? The Wikimedia Mission is to disseminate freely-licensed educational content effectively and globally. For Wikimedia Commons, what could constitute more effective dissemination than letting every MediaWiki wiki use its content so easily and transparently? Not a single thing that I can think of.

23 November, 2008

Comment

1

This is really great. I strongly agree with last paragraph. Hope to see the API implemented for non-MediaWiki software, making Wikimedia Commons works as ubiquitous as possible. Imagine archive.org’s media collection with the rigor of community curation and free culture licensing mandates. It could take over the world. :)

Regarding the second to last paragraph, pingback or something like it may be appropriate.

Mike Linksvayer · 25. November 2008, 05:12

2

“Hope to see the API implemented for non-MediaWiki software, making Wikimedia Commons works as ubiquitous as possible.”

Me too. Other web apps such as Wordpress are the obvious targets after MediaWiki. That is partly why I have pursued the idea of a community API and a Flickr-like API — copying identi.ca’s idea of emulating the Twitter API. If we can emulate the Flickr API, all the current Flickr plugins would work seamlessly with Wikimedia Commons after just changing one line.

That’s my dream, anyway. :)

pfctdayelise · 25. November 2008, 09:41
