
Talk:Export-import



:[[Wikipedia]] unrighteously uses a mass of [[GFDL corpus]] content that was donated "to the GFDL itself," not "to Wikipedia" - no ownership rights were ever ceded to [[Wikimedia]] in particular, and even new contributions are not so deeded. So the rights of those contributors and of those whom you call "wikipedians" are not the same thing, and attempts to make them the same thing are easy enough to slap down legally. We're watching all your mistakes.
::We care very deeply about preserving the right to fork and the right to freely redistribute Wikipedia content. However our hardware resources are limited. We couldn't possibly serve someone trying to request hundreds of pages per second, although we'd be happy for them to obtain our content in a more orderly fashion. Similarly, we would prefer it if mirrors and forks would cache content locally rather than fetching it from Wikipedia on every client request. It is not against the GFDL to require that they do so. -- [[User:Tim Starling|Tim Starling]] 13:28, 24 Jun 2004 (EEST)
:::This is exactly why we need to focus our efforts on creating working [[Export-import]] functionality, so that we can serve reasonably fresh content from many wikis without driving people to other wikis whenever they could stay within [[Publish Wiki]]. --[[User:Juxo|Juxo]] 14:03, 24 Jun 2004 (EEST)
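
A minimal sketch of the export half of such a round trip, using MediaWiki's standard Special:Export page. The source wiki URL, user-agent string, and schema version below are illustrative assumptions, not details from this discussion:

<syntaxhighlight lang="python">
import urllib.request
from urllib.parse import quote
import xml.etree.ElementTree as ET

# Hypothetical source wiki; Special:Export is a standard MediaWiki page.
EXPORT_URL = "https://en.wikipedia.org/wiki/Special:Export/{title}"
# The export XML namespace embeds a schema version that varies by MediaWiki release.
MW_NS = "{http://www.mediawiki.org/xml/export-0.10/}"

def export_page(title: str) -> str:
    """Fetch the latest revision of `title` as raw wikitext."""
    req = urllib.request.Request(
        EXPORT_URL.format(title=quote(title)),
        headers={"User-Agent": "export-import-sketch/0.1"},  # identify the client politely
    )
    with urllib.request.urlopen(req) as resp:
        tree = ET.parse(resp)
    # The export XML nests <page><revision><text>...</text></revision></page>.
    node = tree.find(f".//{MW_NS}text")
    return node.text if node is not None and node.text else ""
</syntaxhighlight>

The matching import half would feed that XML to the target wiki's Special:Import page, which requires import rights on the receiving wiki.
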
::::"We couldn't possibly serve someone trying to request hundreds of pages per second," but this just doesn't happen nor could it ever.  A page requested via the so-called "leech facility" (though the correct term would be [[corpus import]] probably) would end up called up once and only referenced again if it was reviewed again - this can be cached without any great difficulty using off the shelf software.
::::Also, set up an [[independent board]] and lots of cash will flow in - none of which will come anywhere near it as long as Bomis people run everything and practice [[Wikimedia corruption]].
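
On the caching point above, a minimal sketch of the fetch-once, serve-locally behaviour a mirror could use. The cache directory, TTL, and fetch callable are illustrative assumptions, not anything the project ships:

<syntaxhighlight lang="python">
import time
from pathlib import Path

CACHE_DIR = Path("page-cache")  # hypothetical local store
TTL_SECONDS = 24 * 3600         # refresh upstream at most once a day

def cached_page(title: str, fetch) -> str:
    """Serve `title` from the local cache, touching the source wiki only on a miss or expiry."""
    CACHE_DIR.mkdir(exist_ok=True)
    # Real code would sanitise titles for the filesystem; this sketch assumes simple names.
    entry = CACHE_DIR / f"{title}.txt"
    if entry.exists() and time.time() - entry.stat().st_mtime < TTL_SECONDS:
        return entry.read_text()   # cache hit: no load on the source wiki
    text = fetch(title)            # cache miss: one upstream request, e.g. export_page above
    entry.write_text(text)
    return text
</syntaxhighlight>

With something like this in front, a mirror hits the source wiki at most once per page per TTL window, which is exactly the orderly access pattern asked for earlier in the thread.
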