Knowledge graphs
Acquiring access for our consumers to a [[w:semantic network|semantic network]] of relevant [[w:linked data|linked]] [[w:open data|open data]] compiled by other efforts and structured by a number of [[w:ontology (information science)|ontologies]] is obviously key to Consumerium. Reciprocally, we aim to make the information we gather and compile available to other efforts.
See also: [[mw:Manual:Managing data in MediaWiki]] at mediawiki.org
= Wikidata =
[[File:Wikidata-logo-en.svg|thumb|right|260px|The [https://wikidata.org Wikidata] logo]]
'''[[w:Wikidata|Wikidata]]''' [https://wikidata.org (.org)] is a [[w:knowledge base|knowledge base]], an effort to store and serve structured data to [[Wikimedia]] wikis and, to a more limited extent, to other parties. The Wikidata effort was launched in [[2012]].
The underlying software is '''[[#Wikibase|Wikibase]]''', which consists of two [[Mediawiki|MediaWiki]] extensions: the [[mw:Extension:Wikibase Repository|Wikibase Repository]] and the [[mw:Extension:Wikibase Client|Wikibase Client]].
Wikibase allows [[interwiki]] links to be managed centrally in Wikidata, removing much contributor annoyance, redundancy and error-proneness.
Wikidata is obviously a very viable source of [[reference]]-level data once it is technically possible for non-WMF wikis to access the data items. (See the [[#LinkedWiki extension]] for a potential workaround for this limitation.)
It can be accessed outside of WMF wikis with the following (a minimal Python query sketch follows this list):
* [https://query.wikidata.org/ Wikidata's SPARQL endpoint] using [[SPARQL]] ([[wikidata:Wikidata:SPARQL_query_service/queries|Wikidata advice on how to query]]).
* [[mw:Wikidata Toolkit]], a library for Java programs to access data in the Wikidata repository.
* [https://rdflib.github.io/sparqlwrapper/ SPARQLWrapper], a SPARQL endpoint interface for Python.
* [[wikidata:Wikidata:Database download|Wikidata database download]]
* The [https://tools.wmflabs.org/wikidata-game/# Wikidata Game] [[w:gamification|gamifies]] adding information to Wikidata.
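A minimal sketch of querying the SPARQL endpoint from Python with the SPARQLWrapper library mentioned above (the query itself is only illustrative; P31 is the "instance of" property and Q4830453 is the item for "business"):
<syntaxhighlight lang="python">
# A minimal sketch: list a few businesses known to Wikidata.
# Requires: pip install sparqlwrapper
from SPARQLWrapper import SPARQLWrapper, JSON

# Wikidata asks automated clients to send a descriptive User-Agent string.
sparql = SPARQLWrapper("https://query.wikidata.org/sparql",
                       agent="ConsumeriumExample/0.1 (https://example.org/; contact@example.org)")
sparql.setReturnFormat(JSON)
sparql.setQuery("""
SELECT ?company ?companyLabel WHERE {
  ?company wdt:P31 wd:Q4830453 .                          # instance of: business
  SERVICE wikibase:label { bd:serviceParam wikibase:language "en". }
}
LIMIT 5
""")

results = sparql.query().convert()
for row in results["results"]["bindings"]:
    print(row["company"]["value"], "-", row["companyLabel"]["value"])
</syntaxhighlight>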
The main entry point of any Wikidata item is a [[w:JSON|JSON]] dictionary of roughly this form (a short fetch example in Python follows the sketch):
<pre>
{"labels":       by-language dictionary,
 "descriptions": by-language dictionary,
 "aliases":      by-language dictionary,
 "claims":       per-property lists of statements (values),
 "sitelinks":    by-site dictionary}
</pre>
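A minimal sketch, in Python, of fetching that dictionary for a single item through the Special:EntityData entry point (the item Q42 and the User-Agent string are just examples):
<syntaxhighlight lang="python">
# A minimal sketch: fetch one item's JSON and read a few of the top-level keys.
import requests

item_id = "Q42"   # Douglas Adams, used purely as an example item
url = f"https://www.wikidata.org/wiki/Special:EntityData/{item_id}.json"
response = requests.get(url, headers={"User-Agent": "ConsumeriumExample/0.1 (contact@example.org)"})
entity = response.json()["entities"][item_id]

print(entity["labels"]["en"]["value"])           # English label
print(entity["descriptions"]["en"]["value"])     # English description
print(len(entity["claims"]), "properties carry claims on this item")
print(entity["sitelinks"]["enwiki"]["title"])    # title of the linked English Wikipedia article
</syntaxhighlight>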
== Lexicographical Wikidata ==
A '''lexeme''' is a unit of [[w:lexical semantics|lexical]] meaning that underlies a set of words that are related through [[w:inflection|inflection]]. It is a basic abstract unit of meaning,<ref>''The Cambridge Encyclopedia of The English Language''. Ed. [[w:David Crystal|]]. Cambridge: Cambridge University Press, 1995. p. 118. {{ISBN|0521401798}}.</ref> a [[w:emic unit|unit]] of [[w:Morphology (linguistics)|morphological]] [[w:Semantic analysis (linguistics)|analysis]] in [[w:linguistics|linguistics]] that roughly corresponds to a set of forms taken by a single root [[w:word|word]]. For example, in [[w:English language|English]], ''run'', ''runs'', ''ran'' and ''running'' are forms of the same lexeme, which can be represented as <span style="font-variant:small-caps; text-transform:lowercase;">RUN</span> (Wikipedia on 2019-12-29)
Since 2018, Wikidata has also stored a new type of data: words, phrases and sentences, in many languages, described in many languages. This information is stored in new types of entities, called '''Lexemes''' ('''L'''), '''Forms''' ('''F''') and '''Senses''' ('''S''').<ref>[[wikibooks:SPARQL/WIKIDATA Lexicographical data]]</ref> This is enabled by [[mw:Extension:WikibaseLexeme|the WikibaseLexeme extension]]; a sample lexeme query follows the links below.
* [[wikidata:Wikidata:Lexicographical_data/Documentation|The main documentation page for lexicographical data on Wikidata]]
* [[wikidata:Wikidata:Tools/Lexicographical data|Wikidata's list of lexicographical properties and tools using the lexicographical data]]
* [[wikibooks:SPARQL/WIKIDATA Lexicographical data|Wikibook on Wikidata's lexicographical data]]
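A minimal sketch of a lexeme query against the same endpoint, following the patterns documented in the Wikibook linked above (Q1860 is the item for the English language; the ontolex:, dct: and wikibase: prefixes are predefined by the query service):
<syntaxhighlight lang="python">
# A minimal sketch: list a few English lexemes and their lemmas.
# Requires: pip install sparqlwrapper
from SPARQLWrapper import SPARQLWrapper, JSON

sparql = SPARQLWrapper("https://query.wikidata.org/sparql",
                       agent="ConsumeriumExample/0.1 (contact@example.org)")
sparql.setReturnFormat(JSON)
sparql.setQuery("""
SELECT ?lexeme ?lemma WHERE {
  ?lexeme a ontolex:LexicalEntry ;
          dct:language wd:Q1860 ;                 # language: English
          wikibase:lemma ?lemma .
}
LIMIT 10
""")

for row in sparql.query().convert()["results"]["bindings"]:
    print(row["lexeme"]["value"], "-", row["lemma"]["value"])
</syntaxhighlight>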
== Useful information ==
* [[wikidata:Wikidata:Tools/External_tools|Wikidata's list of external tools that make use of Wikidata's knowledge]]
'''More info'''
* [[m:Wikidata|Metawiki on Wikidata]]
* [[w:Wikipedia:Wikidata|Wikipedia advice on Wikidata issues]]
== Wikibase ==
[[File:Wikibase_logo.png|thumb|right|250px|The [http://wikiba.se/ Wikibase] logo]]
'''[[mw:Wikibase|Wikibase]]''' [http://wikiba.se/ (wikiba.se)] is a system for storing and querying structured data that powers [[Wikidata]] and other wikis.
Wikibase consists of two extensions:
:# [[mw:Extension:Wikibase Repository|Wikibase Repository]], which allows a wiki to work as a repository for structured data.
:# [[mw:Extension:Wikibase Client|Wikibase Client]], which allows a wiki to access structured data from a repository. The client can only use a repository whose database it can access directly, so the two wikis must reside on the same machine or behind the same load balancer.
=== Installation of Wikibase ===
See the [[mw:Wikibase/Installation|Wikibase installation instructions at MediaWiki.org]] and the [[mw:Wikibase/Installation/Advanced_configuration|advanced configuration of Wikibase]].
The installation instructions assume you install the dependencies with [[mw:Composer|Composer]], a PHP package manager that makes dependency management easy.
=== Useful extensions in conjunction with Wikibase ===
* [[mw:Extension:ArticlePlaceholder]] makes article placeholders from repository data and invites users to create the article.
* [[mw:Extension:UniversalLanguageSelector]] is recommended in conjunction with Wikibase for user comfort.
== Alternative to using Wikibase Client ==
* [[#LinkedWiki extension]] can be configured to access multiple SPARQL endpoints.
=== Useful information ===
* [[mw:Wikibase/DataModel|The data model used in Wikibase]] (thorough) and [[mw:Wikibase/DataModel/Primer|the primer on the data model]] (quick access)
* [https://github.com/wikimedia/mediawiki-extensions-Wikibase/blob/master/docs/federation.wiki Information on the federation of Wikibase at GitHub]
* [https://phabricator.wikimedia.org/T159240 Phabricator task to "Document how to set up federated Wikibase instances"]
* [https://phabricator.wikimedia.org/T196997 Phabricator task to "Wikidata/Wikibase federation and distribution"] and the [https://docs.google.com/document/d/1YYIuQzcWz2cH9zTUbbfiBrTecVxMWAu3_9-U4DcKzEU/edit#heading=h.z49pje934vur Google Docs document charting out the task]
----
= Semantic MediaWiki =
[[File:SemanticMediaWiki_Logo.png|thumb|right|200px|The [https://www.semantic-mediawiki.org/wiki/Semantic_MediaWiki Semantic MediaWiki] logo]]
'''Semantic MediaWiki''' [https://www.semantic-mediawiki.org/wiki/Semantic_MediaWiki (.org)] (SMW) is a free, open-source [[MediaWiki extensions|extension]] to [[MediaWiki]] that lets you store and query semantic data within the wiki itself, and it seems well suited to [[Consumerium]]'s information infrastructure needs.
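If Semantic MediaWiki were deployed on a Consumerium wiki, the stored properties could also be read programmatically through SMW's <code>ask</code> API module. A rough sketch under that assumption; the wiki URL, category and property names below are hypothetical placeholders:
<syntaxhighlight lang="python">
# A rough sketch of reading Semantic MediaWiki data through its 'ask' API module.
# The wiki URL, category name and property name are hypothetical placeholders.
import requests

api_url = "https://wiki.example.org/w/api.php"    # a hypothetical SMW-enabled wiki
params = {
    "action": "ask",
    "query": "[[Category:Company]]|?Homepage|limit=5",   # an #ask query string
    "format": "json",
}
data = requests.get(api_url, params=params).json()

# Matching pages are returned under query -> results, keyed by page title.
for title, page in data.get("query", {}).get("results", {}).items():
    print(title, page.get("printouts", {}).get("Homepage", []))
</syntaxhighlight>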
== Spinoff extensions ==
A variety of open-source MediaWiki extensions exist that use the data structure provided by Semantic MediaWiki.
Among the most notable of the [http://semantic-mediawiki.org/wiki/Help:SMW_extensions Semantic MediaWiki extensions] are:
* [[mw:Extension:Semantic Forms|Semantic Forms]] - enables user-created forms for adding and editing pages that use semantic data
* [[mw:Extension:Semantic Result Formats|Semantic Result Formats]] - provides a large number of display formats for semantic data, including charts, graphs, calendars and mathematical functions
* [[mw:Extension:Semantic Drilldown|Semantic Drilldown]] - provides a [[w:faceted browser|faceted-browser]] interface for viewing the semantic data in a wiki
* [[mw:Extension:Semantic Maps|Semantic Maps]] - displays geographic semantic data using various mapping services
----
= Open Food Facts =
{{:Open Food Facts}}
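Open Food Facts also offers a public read API that returns a single product's data as JSON, keyed by barcode. A minimal sketch; the barcode is only an example, and the exact response fields should be checked against the current API documentation:
<syntaxhighlight lang="python">
# A minimal sketch: look up one product in the Open Food Facts read API by barcode.
import requests

barcode = "3017620422003"   # an example barcode; any EAN/UPC known to the database works
url = f"https://world.openfoodfacts.org/api/v0/product/{barcode}.json"
data = requests.get(url, headers={"User-Agent": "ConsumeriumExample/0.1 (contact@example.org)"}).json()

if data.get("status") == 1:          # 1 means the product was found
    product = data["product"]
    print(product.get("product_name"))
    print(product.get("brands"))
    print(product.get("ingredients_text"))
else:
    print("product not found:", barcode)
</syntaxhighlight>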
----
= DBpedia =
[[File:DBpediaLogo.svg|thumb|right|260px|The [https://www.dbpedia.org/ DBpedia] logo]]
'''[[w:DBpedia|DBpedia]]''' [https://www.dbpedia.org/ (.org)] is a community effort to move the web ''"Towards a Public Data Infrastructure for a Large, Multilingual, Semantic Knowledge Graph"''.
Today the DBpedia '''[[w:data set|data set]]s''' contain a wealth of information structured into '''[[w:Ontology (information science)|ontologies]]'''. This [[w:structured data|structured data]] can be queried with the [[SPARQL]] query language at their [http://dbpedia.org/sparql public DBpedia SPARQL endpoint].
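A minimal sketch of querying that public endpoint from Python over plain HTTP; the query is only illustrative, with <code>dbo:Company</code> being one of the ontology classes listed below:
<syntaxhighlight lang="python">
# A minimal sketch: ask the public DBpedia SPARQL endpoint for a few companies.
import requests

endpoint = "http://dbpedia.org/sparql"
query = """
PREFIX dbo:  <http://dbpedia.org/ontology/>
PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>

SELECT ?company ?name WHERE {
  ?company a dbo:Company ;
           rdfs:label ?name .
  FILTER (lang(?name) = "en")
}
LIMIT 5
"""
response = requests.get(endpoint,
                        params={"query": query, "format": "application/sparql-results+json"})
for row in response.json()["results"]["bindings"]:
    print(row["company"]["value"], "-", row["name"]["value"])
</syntaxhighlight>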
DBpedia uses the [[w:Virtuoso Universal Server|Virtuoso Universal Server]] to store and query the data (see [https://www.w3.org/wiki/VirtuosoUniversalServer Virtuoso Universal Server at w3.org]).
There are many ways in which the DBpedia ontology and datasets could be used in the Consumerium implementation stage wiki.
== Ontology classes useful for implementing Consumerium ==
'''[https://dief.tools.dbpedia.org/server/ontology/classes/ All DBpedia ontology classes]'''
* [http://mappings.dbpedia.org/server/ontology/classes/Company Company]
== DBpedia datasets ==
* '''[http://wiki.dbpedia.org/Datasets DBpedia datasets]''' have been released annually, sometimes with improvements more frequently. The latest (as of July 2018) [https://wiki.dbpedia.org/develop/datasets/dbpedia-version-2016-10 DBpedia dataset version is 2016-10], which was published in 2017.
== DBpedia Databus ==
The [https://databus.dbpedia.org/ '''DBpedia Databus''' at databus.dbpedia.org] ''is a data cataloging and versioning platform for data developers and consumers.''
[https://databus.dbpedia.org/sparql '''SPARQL endpoint''' at databus.dbpedia.org]
DBpedia developed the DBpedia Databus in the late 2010s, with a Databus alpha published in May 2018.
== History of DBpedia ==
DBpedia began as an effort to extract structured information from [[Wikipedia]] [[templates|infobox templates]] and [[categories]] and to make this information available on the Web, with the initial release on January 10, 2007.
== More info on DBpedia ==
* [https://wiki.dbpedia.org/ DBpedia wiki]
* [https://blog.dbpedia.org/ DBpedia blog]
* [https://github.com/dbpedia/ DBpedia code at GitHub]
----
= LinkedWiki extension =
[[File:LogoLinkedWiki.png|thumb|right|260px|Logo of the [[mw:Extension:LinkedWiki|LinkedWiki extension]]]]
A possible way to tap into various knowledge graphs is the [[mw:Extension:LinkedWiki|LinkedWiki extension]]. LinkedWiki has been developed since 2010 by [[mw:User:Karima Rafes|Karima Rafes]], a [[#Semantic MediaWiki|Semantic MediaWiki]] developer and CEO of [http://www.bordercloud.com/ BorderCloud.com].
'''See also'''
* [https://linkedwiki.com/ LinkedWiki.com] - a place to discover available linked data and to share SPARQL queries
* [[Database]] for a higher-level view of what is going on with the databases
= References =
<references/>
[[Category:List]]
[[Category:Technology]]