Metaweb
Metaweb is visible at http://www.metaweb.com
It is an experimental wiki that may become a large public wiki. Like Consumerium, it uses the mediawiki software in its R&D phase - and, also like Consumerium, it intends to create tools for working with material in a prototype Wikitext standard format with well-developed multi-language conventions, most likely based on the conventions now used in Wikipedia.
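To make the "multi-language conventions" concrete, here is a minimal sketch in Python, assuming only the Wikipedia-style [[language-code:Title]] interlanguage link convention - nothing in it is taken from Metaweb's or Consumerium's actual tools.

 import re

 # Wikipedia-style interlanguage links look like [[fi:Etusivu]] or [[de:Hauptseite]].
 # The two-to-three letter prefix assumption covers common language codes, though it
 # will also match a few non-language prefixes - good enough for a sketch.
 INTERLANG = re.compile(r"\[\[([a-z]{2,3}):([^\]|]+)\]\]")

 def interlanguage_links(wikitext):
     """Return {language code: page title} for each interlanguage link found."""
     return {code: title.strip() for code, title in INTERLANG.findall(wikitext)}

 sample = "Metaweb is a wiki.\n[[fi:Metaweb]]\n[[de:Metaweb]]"
 print(interlanguage_links(sample))   # {'fi': 'Metaweb', 'de': 'Metaweb'}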
It is very clear that they intend NOT to use mediawiki or perl exclusively, or maybe at all, in the long term - they have defined an intermediate page format which is explicitly designed for their own custom software to ingest and turn into a semantic web. This is something the Signal Wiki (itself a semantic web, perhaps segmented by faction) must be able to do to the Research Wiki as a preliminary step to affecting the Consumerium buying signal. So their ultimate solution might also be ours, if we can co-operate with them early enough and build on the same tool base.
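Their intermediate page format is only linked above, not reproduced, so the following is purely a hypothetical sketch of the general idea: a page split into named sections that custom software can reduce to subject-predicate-object triples for a semantic web. The "==" heading split and the predicate names are assumptions, not Metaweb's actual format.

 import re

 def page_to_triples(title, intermediate_text):
     """Reduce a sectioned page to (subject, predicate, object) triples."""
     triples = []
     # Split on level-2 wiki headings; text before the first heading is the lead.
     parts = re.split(r"^==\s*(.+?)\s*==\s*$", intermediate_text, flags=re.M)
     lead, rest = parts[0], parts[1:]
     if lead.strip():
         triples.append((title, "has_lead", lead.strip()))
     for heading, body in zip(rest[0::2], rest[1::2]):
         triples.append((title, "has_section:" + heading, body.strip()))
     return triples

A Signal Wiki tool could run something like this over Research Wiki pages and reason over the resulting triples, though that choice is best deferred until we see what tools Metaweb actually ships.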
A first step toward this might be simply to adopt a close-enough variation of their intermediate page format (probably with different section titles - we don't want a "Stephensonia" section) so that we can use their tools here with few adaptations. See Consumerium:intermediate page format for the abstract, and Consumerium:intermediate page for lists of the types of such pages here.
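The "different section titles" part could be as small as a renaming step applied before the shared tools ever see a page. Only the name "Stephensonia" comes from the discussion above; the local replacement title and the mechanism are placeholders in this sketch.

 # Only "Stephensonia" is taken from the text above; the replacement is a placeholder.
 SECTION_MAP = {
     "Stephensonia": "Terminology",
 }

 def remap_headings(intermediate_text):
     """Rename Metaweb section headings to their assumed local equivalents."""
     for metaweb_name, local_name in SECTION_MAP.items():
         intermediate_text = intermediate_text.replace(
             "== %s ==" % metaweb_name, "== %s ==" % local_name)
     return intermediate_text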
They also seem to be taking the lead in thinking about how raw wikitext and an arbitrary XML DTD like ConsuML can be combined and translated into an XML-like semantic web. An XML dump might not be page by page and strictly marked up for style; it might instead cover whole topic areas at once, or, in our application, all factionally defined terms unique to one faction. Whether the semantic web can then be re-integrated after multiple parties have edited the different factional sections independently is still up for grabs. Again, we should follow their lead, as this is what they plan to do in general.
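Since ConsuML is only named here and not specified, the element names below are placeholders. This is a minimal sketch, under those assumptions, of what a faction-segmented XML dump and a naive re-integration step might look like.

 import xml.etree.ElementTree as ET

 def dump_by_faction(records):
     """records: iterable of (faction, term, definition) tuples -> one XML tree."""
     root = ET.Element("corpus")          # placeholder element names, not ConsuML
     factions = {}
     for faction, term, definition in records:
         if faction not in factions:
             factions[faction] = ET.SubElement(root, "faction", name=faction)
         entry = ET.SubElement(factions[faction], "term", name=term)
         entry.text = definition
     return root

 def reintegrate(dumps):
     """Merge independently edited factional dumps back into a single corpus.
     Conflicting (faction, term) pairs are resolved by 'later dump wins'."""
     merged = {}
     for root in dumps:
         for faction in root.findall("faction"):
             for term in faction.findall("term"):
                 merged[(faction.get("name"), term.get("name"))] = term.text
     return dump_by_faction((f, t, text) for (f, t), text in merged.items())

The "later dump wins" merge policy is deliberately naive; how factional edits really get reconciled is exactly the part that is still up for grabs.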
It is not clear how they plan to deal with licensing, but they are presently GFDL and seem quite aware that mediawiki is not capable of really implementing this license, nor of supporting advanced GFDL capabilities like Invariant Sections, which would be required for any kind of certification or validation of article versions. GetWiki may be better, but that is not clear either.
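Neither mediawiki nor GetWiki is claimed to work this way; the sketch below only illustrates what validating Invariant Sections across article versions could amount to - each invariant section has to survive byte-for-byte between certified versions.

 import hashlib

 def section_digest(text):
     return hashlib.sha1(text.encode("utf-8")).hexdigest()

 def invariant_sections_unchanged(old_sections, new_sections):
     """Both arguments map invariant section titles to their text."""
     if set(old_sections) != set(new_sections):
         return False   # an invariant section was added, removed or renamed
     return all(section_digest(old_sections[t]) == section_digest(new_sections[t])
                for t in old_sections)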
The project includes some major brains like Danny Hillis and Neal Stephenson, and seems troll friendly enough at the moment to make it possible to at least introduce the correct bridging ideas into both projects.
As Neal Stephenson explains:
- "My own view of the Metaweb is pretty straightforward: I don't think that the Internet, as it currently exists, does a very good job of explaining things to people. It is great for selling stuff, distributing news and dirty pictures, and a few other things. But when you need to get a good explanation of something, whether it is a scientific principle, a bit of gardening advice, or how to change a tire, you have to sift through a vast number of pages to find the one that gives you the explanation that is right for you. Generally this is not a problem with the explanations themselves. On the contrary, it seems as though a lot of people like to explain things on the Internet, and some of them are quite good at it. The problem lies in how these explanations are organized."
Metaweb cooperates with other GFDL text corpus services, but has a specific focus on Neal Stephenson's ideas. Therefore some concepts, like the faction, will often go by other names, such as the phyle, which is named for a faction-like concept in Stephenson's work. Such differences are noted as Metaweb: Stephensonia.
See wikitext standard, interwiki link standard, interwiki identity standard and standard wiki URI for other potential areas of technology synergy. Since Metaweb is sponsored by Danny Hillis, it has great potential to advance the state of the art in wiki text corpus management, starting with the GFDL text corpus.