- From: Debattista, Jeremy <Jeremy.Debattista@iais-extern.fraunhofer.de>
- Date: Sat, 3 May 2014 10:35:57 +0000
- To: Phil Archer <phila@w3.org>
- CC: Steven Adler <adler1@us.ibm.com>, Bernadette Farias Lóscio <bfl@cin.ufpe.br>, Eric Kauz <eric.kauz@gs1.org>, DWBP Vocabs <public-dwbp-vocabs@w3.org>
- Message-ID: <33B6A7B1-6EA0-4AA2-A3DE-F91F11B2B547@iais-extern.fraunhofer.de>
Hi Phil,

We already defined most of the concepts [1] (probably not in that detail) that you are currently specifying; they will be used for the DIACHRON project (@Steve, we have four “customers” - project partners - there: Datapublica [2], DataMarket [3], EBI [4] and Brox [5]). We will also have a namespace for them (probably a purl namespace for now). The concepts are based on the conceptual daQ model (linked by Ghislain in this thread) that I presented to you at our f2f meeting.

I suggest that, rather than duplicating the effort, we first define a number of domain-independent quality dimensions (different customers will require different dimensions and metrics). We could then identify the properties for each of these dimensions, drawing on Makx's work and/or Amrapali's survey paper [6]. If daQ is used, it should also be easier to integrate quality metadata into CKAN. A rough Turtle sketch of this approach is appended after the quoted thread below.

@Phil, I'd also be happy to help you with the ontology modelling.

Cheers,
Jer

[1] https://raw.githubusercontent.com/diachron/quality/master/src/main/resources/vocabularies/dqm/dqm.trig
[2] http://www.data-publica.com
[3] https://datamarket.com
[4] http://www.ebi.ac.uk
[5] http://brox.de
[6] http://www.semantic-web-journal.net/system/files/swj556.pdf

On 02 May 2014, at 18:43, Steven Adler <adler1@us.ibm.com> wrote:

Phil,

Good work! When we have vocabulary specifications ready to be tested, I can find a "customer" to test them as an implementation use case, which we can document to make "recommendations".

Best Regards,

Steve

Motto: "Do First, Think, Do it Again"

From: Phil Archer <phila@w3.org>
To: DWBP Vocabs <public-dwbp-vocabs@w3.org>, Eric Kauz <eric.kauz@gs1.org>, Bernadette Farias Lóscio <bfl@cin.ufpe.br>
Date: 05/02/2014 05:45 PM
Subject: Some thoughts on the Q&G vocab

Dear all,

As mentioned on today's call, I've been looking at the data quality and granularity vocabulary. Taking the discussion at the f2f meeting [1], Makx's work under the EU ISA Programme [2] and the ODI Certificates [3] as my starting points, I worked through the issues and made notes in the wiki; based on those notes I then created the diagram. All of this is available at [4].

Eric - you kindly offered to help with the UML modelling, thank you. I've used Enterprise Architect for this - is that what you use, by any chance?

I think there are several high-level talking points:

1. What are we trying to achieve - machine readability? Links to human-readable documentation? Objectivity? Subjectivity?

2. How are we going to test this? Bernadette is building a CKAN extension for the data usage vocab - Bernadette, can it take on this vocab as well? (I hope so.)

The plan so far is for the two vocabs to be Notes, not Recommendations, which means we don't have to prove implementation. However... without implementation nothing is a standard, and if we can take the vocabs through to Recommendation (i.e. prove multiple implementations) they'll carry a lot more weight.

Any and all comments welcome. If focussing on a particular issue, please start a new thread.

Cheers

Phil.
[1] http://www.w3.org/2013/meeting/dwbp/2014-04-01#Data_quality_task_force
[2] http://www.slideshare.net/OpenDataSupport/open-data-quality-29248578 (Slide 8)
[3] https://certificates.theodi.org/overview
[4] https://www.w3.org/2013/dwbp/wiki/Quality_and_Granularity_Description_Vocabulary

--
Phil Archer
W3C Data Activity Lead
http://www.w3.org/2013/data/
http://philarcher.org
+44 (0)7887 767755
@philarcher1
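P.S. To make the daQ suggestion above concrete, here is a minimal Turtle sketch of one domain-independent dimension with one attached metric. The daq: namespace and the daq:Dimension, daq:Metric, daq:hasMetric and daq:expectedDataType terms are my reading of the daQ model as presented at the f2f (verify them against the published vocabulary); the dqm: names are illustrative placeholders, not the actual DIACHRON terms.

    @prefix daq:  <http://purl.org/eis/vocab/daq#> .   # assumed daQ namespace (a purl was planned)
    @prefix dqm:  <http://example.org/dqm#> .          # placeholder for the DIACHRON metrics namespace
    @prefix rdfs: <http://www.w3.org/2000/01/rdf-schema#> .
    @prefix xsd:  <http://www.w3.org/2001/XMLSchema#> .

    # A domain-independent quality dimension.
    dqm:Availability a daq:Dimension ;
        rdfs:label   "Availability"@en ;
        rdfs:comment "The extent to which a dataset, or part of it, is reachable."@en .

    # A concrete, machine-computable metric grouped under that dimension.
    dqm:EndPointAvailability a daq:Metric ;
        rdfs:label   "SPARQL endpoint availability"@en ;
        daq:expectedDataType xsd:boolean .             # per the daQ draft; check the spec

    dqm:Availability daq:hasMetric dqm:EndPointAvailability .

Because the result is ordinary RDF, the computed metric values can travel with the dataset's metadata, which is what should make surfacing them through a CKAN extension straightforward.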
Received on Saturday, 3 May 2014 10:36:48 UTC