Talk:Interwiki link standard
MediaWiki supports a deliberately Wikipedia-centric scheme in which, for instance, "[ [ en: ] ]" means not "in English" but "in the English Wikipedia".
- Not necessarily. I think it's up to how the interwiki linking is configured. We could have fr: point to fr.consumerium.org, which it does not because there is no fr.consumerium.org, but w:fr: would still point to the French Wikipedia. This is with the current MediaWiki release candidate, so stop whining that other people are not working hard enough or not working the way you think they should.
- This has nothing to do with the way anyone "thinks". The scheme is still Wikipedia-centric. "FR" is the abbreviation for the French language, not some subdomain of Wikipedia. A French article in the GFDL text corpus should have a name like fr:anomie, and if you don't put "wikipedia:" in the middle to get fr:wikipedia:anomie, then it should simply try to find any article in French on anomie based on rules, starting with the service the article is on. Classic global lookup, as in every programming language. Make it fr:w: and we have no problem. The current release candidate is wrong, since it makes Wikipedia the only possible default and doesn't put language first in dividing up the GFDL text corpus.
if
[ [ language:service:namespace_within_service:page/subpage#section ] ]
were revised to
[ [ service:language:namespace_within_service:page/subpage#section ] ]
you would be pretty close to the interwiki "standard" as it is currently implemented.
As a result, a reference to "[ [ en: Metaweb: phyle ] ]" will be incorrectly interpreted as a reference to the English Wikipedia, where there is no such article, instead of correctly as a reference to the English Metaweb article 'phyle', which does exist.
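The language-first lookup argued for above can be sketched as follows. This is a hypothetical resolver written for illustration, not MediaWiki's actual code; the service table, its prefixes, and its page contents are all invented assumptions.

```python
# Hypothetical interwiki resolver illustrating language-first lookup.
# The SERVICES dict and its contents are invented for illustration;
# a real wiki would consult its own interwiki configuration instead.

# Maps service prefix -> {language -> set of page titles it has}.
SERVICES = {
    "w": {"en": set(), "fr": {"anomie"}},  # Wikipedia (assumed contents)
    "metaweb": {"en": {"phyle"}},          # Metaweb (assumed contents)
}

def resolve(language, page, current_service, search_order=("w", "metaweb")):
    """Find `page` in `language`, starting with the service the linking
    article is on, then falling back to the other known services
    ("classic global lookup")."""
    order = [current_service] + [s for s in search_order if s != current_service]
    for service in order:
        pages = SERVICES.get(service, {}).get(language, set())
        if page in pages:
            return f"{service}:{language}:{page}"
    return None

# [ [ en: phyle ] ] written on a Metaweb page: language-first lookup
# finds the English Metaweb article rather than failing on Wikipedia.
print(resolve("en", "phyle", current_service="metaweb"))  # metaweb:en:phyle
```

Under this sketch, the same link text resolves differently depending on where it is written, and Wikipedia is just one service in the search order rather than the hard-coded default.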
MediaWiki is likely to continue to resist and retard the development of such a standard for the usual reasons (typically software imperialism; see Wikimedia for discussion of this). Prove us wrong?
- You needn't bash MediaWiki developers all the time. They have helped me very much with the problems I've been having setting this wiki up and keeping it up to date with the latest required code. In the last upgrade we got XML export working, and now it's up to someone (maybe us) to develop the multiple-source import functionality briefly discussed on Wikinfo with Proteus, who said it isn't the highest of priorities and might appear in GetWiki 2.0, which by classical free software development cycles is far, far away.
- The worst of those developers is TimStarling, who wants to add all these police-state features. And as you know, there is a strong case to move to MoinMoin or whatever Metaweb comes up with. You can bet that when Danny Hillis starts writing code to deal with the GFDL text corpus, it sure won't be crap in PHP, and full-text search will work no matter what the load. It's more a question of the MediaWiki folks, even Proteus/Parrott, not knowing what matters, and being basically script kiddies compared to the Python and Metaweb people. Or almost anyone else.