
Fwd: [Wikidata-l] Wikidata RDF export available

From: Markus Krötzsch <markus.kroetzsch@cs.ox.ac.uk>
Date: Sat, 03 Aug 2013 15:01:38 +0100
Message-ID: <51FD0D42.7040909@cs.ox.ac.uk>
To: "semantic-web@w3.org" <semantic-web@w3.org>
FYI: Wikidata now provides initial RDF exports.

There are some OWL axioms in there, too, to encode that something has 
*no* value for a certain property, but most of it is plain RDF. The data 
is not too big yet, as the site is still very young and under continued 
development (e.g., coordinates could not be entered until June, times 
not until the end of May). But don't worry -- Wikidata is extremely 
active and things will continue to grow fast ;-)
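
For context, the "no value" encoding follows the usual OWL pattern of a
complement class: the item is an instance of the complement of "has some
value for the property". A sketch in Turtle (the ex: names are
illustrative, not the namespaces actually used in the export):

```turtle
@prefix rdf: <http://www.w3.org/1999/02/22-rdf-syntax-ns#> .
@prefix owl: <http://www.w3.org/2002/07/owl#> .
@prefix ex:  <http://example.org/> .

# "ex:item has *no* value for ex:prop", expressed as membership in
# the complement of the class of things with some value for ex:prop.
ex:item rdf:type [
    rdf:type owl:Class ;
    owl:complementOf [
        rdf:type owl:Restriction ;
        owl:onProperty ex:prop ;
        owl:someValuesFrom owl:Thing
    ]
] .
```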

Cheers,

Markus


-------- Original Message --------
Subject: [Wikidata-l] Wikidata RDF export available
Date: Sat, 03 Aug 2013 14:48:14 +0100
From: Markus Krötzsch <markus@semantic-mediawiki.org>
Reply-To: Discussion list for the Wikidata project. 
<wikidata-l@lists.wikimedia.org>
To: Discussion list for the Wikidata project. 
<wikidata-l@lists.wikimedia.org>

Hi,

I am happy to report that an initial, yet fully functional RDF export
for Wikidata is now available. The exports can be created using the
wda-export-data.py script of the wda toolkit [1]. This script downloads
recent Wikidata database dumps and processes them to create RDF/Turtle
files. Various options are available to customize the output (e.g., to
export statements but not references, or to export only texts in English
and Wolof). File creation takes a few hours (about three on my
machine), depending on what exactly is exported.

For your convenience, I have created some example exports based on
yesterday's dumps. These can be found at [2]. There are three Turtle
files: site links only, labels/descriptions/aliases only, statements
only. The fourth file is a preliminary version of the Wikibase ontology
that is used in the exports.

The export format is based on our earlier proposal [3], but it adds a
lot of details that had not been specified there yet (namespaces,
references, ID generation, compound datavalue encoding, etc.). Details
might still change, of course. We might provide regular dumps at another
location once the format is stable.

As a side effect of these activities, the wda toolkit [1] is also
getting more convenient to use. Creating code for exporting the data
into other formats is quite easy.

Features and known limitations of the wda RDF export:

(1) All current Wikidata datatypes are supported. Commons-media data is
correctly exported as URLs (not as strings).

(2) One-pass processing. Dumps are processed only once, even though this
means that we may not know the types of all properties when we first
need them; in that case, the script queries wikidata.org for the missing
information. This is only relevant when exporting statements.
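
The on-demand lookup in (2) amounts to a simple cache: property types
seen in the dump are recorded, and anything still unknown triggers a
query. A minimal sketch (the fetch function stands in for the live
wikidata.org query; all names here are illustrative, not the wda API):

```python
# Sketch of one-pass property-type resolution: types encountered while
# reading the dump are cached; unknown ones are fetched on demand
# (the live query is stubbed out below).

def make_type_resolver(fetch_type):
    """fetch_type(prop_id) -> datatype string, e.g. via a live API call."""
    cache = {}

    def record(prop_id, datatype):
        # Called whenever the dump itself reveals a property's type.
        cache[prop_id] = datatype

    def resolve(prop_id):
        # Called when a statement needs the type; falls back to fetching.
        if prop_id not in cache:
            cache[prop_id] = fetch_type(prop_id)
        return cache[prop_id]

    return record, resolve

# Usage with a stub in place of the live query:
record, resolve = make_type_resolver(lambda pid: "string")
record("P625", "globe-coordinate")   # type learned while reading the dump
print(resolve("P625"))               # cached: globe-coordinate
print(resolve("P999"))               # unknown: falls back to the stub
```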

(3) Limited language support. The script uses Wikidata's internal
language codes for string literals in RDF. In some cases, this might not
be correct. It would be great if somebody could create a mapping from
Wikidata language codes to BCP47 language codes (let me know if you
think you can do this, and I'll tell you where to put it).
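
Such a mapping could be as simple as a table of exceptions with a
pass-through default, since many Wikidata codes are already usable as
BCP47 tags. A sketch, with placeholder entries that would need review
by someone who knows both code sets:

```python
# Illustrative Wikidata-to-BCP47 mapping with a pass-through default.
# The entries are placeholders, not a vetted mapping.

WD_TO_BCP47 = {
    "simple": "en",   # simple English has no separate BCP47 tag
    "no": "nb",       # "no" is often intended as Norwegian Bokmal
}

def to_bcp47(wikidata_code):
    # Most codes pass through unchanged; only exceptions need an entry.
    return WD_TO_BCP47.get(wikidata_code, wikidata_code)

print(to_bcp47("de"))      # unchanged: de
print(to_bcp47("simple"))  # mapped: en
```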

(4) Limited site language support. To specify the language of linked
wiki sites, the script extracts a language code from the URL of the
site. Again, this might not be correct in all cases, and it would be
great if somebody had a proper mapping from Wikipedias/Wikivoyages to
language codes.
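
The extraction in (4) can be sketched as taking the first label of the
host name, with a table of exceptions for sites whose subdomain is not
a usable language code (the exception entries below are hypothetical):

```python
from urllib.parse import urlparse

# Sketch: derive a language code from a wiki site URL by taking the
# first host-name label; exceptions table entries are hypothetical.

SITE_EXCEPTIONS = {
    "commons.wikimedia.org": None,   # multilingual, no single language
}

def site_language(url):
    host = urlparse(url).hostname
    if host in SITE_EXCEPTIONS:
        return SITE_EXCEPTIONS[host]
    return host.split(".")[0]

print(site_language("http://en.wikipedia.org/wiki/Oxford"))   # en
print(site_language("http://de.wikivoyage.org/wiki/Berlin"))  # de
```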

(5) Some data excluded. Data that cannot currently be edited is not
exported, even if it is found in the dumps. Examples include statement
ranks and timezones for time datavalues. I also currently exclude labels
and descriptions in simple English, formal German, and informal Dutch,
since these would pollute the label space for English, German, and Dutch
without adding much benefit (with the possible exception of simple
English descriptions, I cannot see any case where these language
variants should ever have texts that differ from their base language).
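
The language exclusion in (5) is essentially a filter on language codes
applied before labels and descriptions are emitted. A minimal sketch,
using the MediaWiki codes for the three variants named above:

```python
# Sketch: drop labels/descriptions in language variants that would
# collide with their base language in the export.

EXCLUDED_CODES = {"simple", "de-formal", "nl-informal"}

def kept_texts(texts):
    """texts: dict mapping language code -> label/description text."""
    return {code: text for code, text in texts.items()
            if code not in EXCLUDED_CODES}

labels = {"en": "Oxford", "simple": "Oxford", "de": "Oxford"}
print(kept_texts(labels))   # {'en': 'Oxford', 'de': 'Oxford'}
```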

Feedback is welcome.

Cheers,

Markus

[1] https://github.com/mkroetzsch/wda
      Run "python wda-export-data.py --help" for usage instructions
[2] http://semanticweb.org/RDF/Wikidata/
[3] http://meta.wikimedia.org/wiki/Wikidata/Development/RDF

-- 
Markus Kroetzsch, Departmental Lecturer
Department of Computer Science, University of Oxford
Room 306, Parks Road, OX1 3QD Oxford, United Kingdom
+44 (0)1865 283529               http://korrekt.org/

_______________________________________________
Wikidata-l mailing list
Wikidata-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata-l
Received on Saturday, 3 August 2013 14:02:04 UTC
