
RE: [Dbpedia-discussion] Advancing the DBpedia ontology

From: Vladimir Alexiev <vladimir.alexiev@ontotext.com>
Date: Wed, 18 Feb 2015 15:03:18 +0200
To: "'dbpedia-ontology'" <dbpedia-ontology@lists.sourceforge.net>
Cc: "'Linked Data community'" <public-lod@w3.org>, "'SW-forum'" <semantic-web@w3.org>, <dbpedia-discussion@lists.sourceforge.net>
Message-ID: <00d401d04b7b$448100b0$cd830210$@alexiev@ontotext.com>
Hi everyone!

My presentations from the Dublin meeting are at

- http://VladimirAlexiev.github.io/pres/20150209-dbpedia/add-mapping-long.html
An example of adding a mapping, creating a couple of properties along the way, and reporting a couple of problems.

- http://VladimirAlexiev.github.io/pres/20150209-dbpedia/dbpedia-problems-long.html
Provides a wider perspective: data problems are due not only to the ontology but to many other areas:
3. Mapping Language Issues
4. Mapping Server Deficiencies
5. Mapping Wiki Deficiencies
6. Mapping Issues
7. Extraction Framework Issues
8. External Mapping Problems
9. Ontology Problems
Almost all of these are also reported in the two trackers described in sec. 2.

> Heiko Paulheim <heiko@informatik.uni-mannheim.de> wrote:
> I am currently working with Aldo Gangemi on exploiting the mappings to DOLCE
> (and the high level disjointness axioms in DOLCE) 
> for finding modeling issues both in the instances and the ontology.

Sounds very interesting!

I've been quite active in the last couple of months, but I've been pecking at random here and there.
More systematic approaches are definitely needed,
as long as they are not limited to a theoretical experiment or a one-time effort that is quickly shut down.

I've observed many error patterns, and if people smarter than me can devise ways to leverage and amplify these observations using algorithmic or ML approaches, that could create fast progress. I give some examples of the "Need for Research" on specific problems in the next section.

> Harald Sack:
> apply the DBpedia ontology to detect inconsistencies and flaws in DBpedia facts.
> This should not only be possible in a retroactive way, but should take place much earlier.
> Besides the detection of inconsistencies during the mapping process or afterwards in the extracted data

Sounds very promising! If I can help somehow with "manual ontological & wiki labor", let me know.
Validating the data against the ontology can provide:
- mapping defect lists
- useful hints that the Extraction Framework can use.
  The most important feature would be "Use Domain & Range to Guide Extraction".
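To make the validation idea concrete, here is a toy sketch of producing a mapping defect list by checking extracted triples against declared ranges. The RANGES and TYPES dictionaries and the sample triples are made-up stand-ins, not the real Extraction Framework's data structures:

```python
# Toy sketch: flag extracted triples whose object violates the
# property's declared range, e.g. dbo:parent rdfs:range dbo:Person.
# RANGES and TYPES are hypothetical stand-ins for ontology lookups.

RANGES = {"dbo:parent": "dbo:Person"}

TYPES = {
    "dbr:Queen_Victoria": {"dbo:Person", "dbo:Royalty"},
    "dbr:England": {"dbo:Place", "dbo:Country"},
}

def range_defects(triples):
    """Return (s, p, o) triples whose object lacks the range class."""
    defects = []
    for s, p, o in triples:
        expected = RANGES.get(p)
        if expected and expected not in TYPES.get(o, set()):
            defects.append((s, p, o))
    return defects

triples = [
    ("dbr:Edward_VII", "dbo:parent", "dbr:Queen_Victoria"),
    ("dbr:Edward_VII", "dbo:parent", "dbr:England"),  # bogus extraction
]
print(range_defects(triples))
# → [('dbr:Edward_VII', 'dbo:parent', 'dbr:England')]
```

Such a defect list could feed both mapping fixes and extraction-time hints.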

> this could already be possible right from the start when the user is changing the wikipedia infobox content
> (in the sense of type checking for domain/range, checking of class disjointness and further constraints, plausibility check for dates

I'm doubtful of the utility of "error lists" to Wikipedia (or it needs to be done with skill and tact):
1. The mapping wiki adopts an object vs. datatype property dichotomy (it uses owl:ObjectProperty and owl:DatatypeProperty, never rdf:Property).
   But MANY Wikipedia fields include both links and text, and in many cases BOTH are useful.
2. At the end of the day, Wikipedia is highly crafted text, so telling Wikipedia editors that they can't write something will not sit well with them.

For example, who should resolve this contradiction?
  DBO: dbo:parent rdfs:range dbo:Person
  Wikipedia: | mother = [[Queen Victoria]] of [[England]]
I think the Extraction Framework should (by filtering out the link that is not a Person), not Wikipedians.
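The filtering I have in mind could be sketched like this (a toy illustration with a made-up TYPES lookup; the real Extraction Framework's internals differ): from an infobox field containing several [[...]] links, keep only the links whose target is typed compatibly with the property's range.

```python
import re

# Toy sketch of "use range to guide extraction": keep only link
# targets compatible with the property's range. TYPES is a
# hypothetical stand-in for a resource-type lookup.

TYPES = {
    "Queen Victoria": {"dbo:Person"},
    "England": {"dbo:Place"},
}

# Matches [[Target]] and [[Target|label]] wiki links.
LINK = re.compile(r"\[\[([^\]|]+)(?:\|[^\]]*)?\]\]")

def extract_objects(field_value, range_class):
    """Return link targets from a wikitext field that fit the range."""
    return [t for t in LINK.findall(field_value)
            if range_class in TYPES.get(t, set())]

# dbo:parent rdfs:range dbo:Person, so only Queen Victoria survives:
print(extract_objects("[[Queen Victoria]] of [[England]]", "dbo:Person"))
# → ['Queen Victoria']
```

This way the contradiction is resolved inside the extractor, without asking Wikipedians to change how they write.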

> a tool that makes inconsistencies/flaws in wikipedia data visible directly in the wikipedia interface,
> where users could either correct them or confirm facts that are originally in doubt.

But Wikipedia is moving towards pulling Wikidata properties into template fields, through the {{#property}} parser function.
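For illustration (a sketch of the direction, not a prescription), an infobox field can pull its value from Wikidata instead of hard-coding it; using the Wikidata "mother" property (P25), the earlier example would become:

```wikitext
| mother = {{#property:P25}}
```

In that world, fixes belong in Wikidata, and per-article "error lists" for infobox text become even less useful.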

Received on Wednesday, 18 February 2015 13:03:44 UTC
