
Notes on the discussions at TPAC on the RDF Canonicalization issue

From: Ivan Herman <ivan@w3.org>
Date: Sun, 28 Oct 2018 07:00:24 +0100
Message-Id: <5B6D6B2F-429C-4741-99BF-0FF376335F91@w3.org>
Cc: W3C Public Archives <www-archive@w3.org>
To: Manu Sporny <msporny@digitalbazaar.com>, Aidan Hogan <aidhog@gmail.com>
(Aidan, it became very much an ad-hoc discussion, so we could not get you in. But you should be part of the discussions, using email. See the notes below.)

Manu and I discussed how to move forward. Here are the main points; Manu, tell me if there is anything to add.

1. A W3C REC in this area is highly unusual, because the core of the spec would be a non-trivial mathematical algorithm. While there is some engineering to do around it (how to store a signature in a graph/dataset, the precise set of crypto methods to use for, e.g., hashing, etc.), that work is easy in comparison. In agreement also with Ralph Swick (Aidan: COO of W3C), we need some sort of solid review of the mathematics involved before moving on.
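To make point 1 concrete: the reason canonicalization is a real algorithm rather than mere engineering is that blank-node labels in an RDF serialization are arbitrary, so hashing the raw bytes fails on isomorphic graphs. A minimal Python sketch with invented toy data; the "canonicalization" here is a deliberately naive first-appearance relabeling that works only for this trivial example, not any of the algorithms under discussion:

```python
import hashlib
import re

# Two N-Quads-style serializations of the same toy graph, differing
# only in the arbitrary blank-node labels chosen by a serializer.
# (Hypothetical example data, not output of any real tool.)
graph_a = "_:b0 <http://example.org/knows> _:b1 .\n"
graph_b = "_:x <http://example.org/knows> _:y .\n"

def naive_hash(nquads: str) -> str:
    """Hash the raw serialization: sensitive to blank-node labels."""
    return hashlib.sha256(nquads.encode("utf-8")).hexdigest()

# The naive hashes differ even though the graphs are isomorphic...
assert naive_hash(graph_a) != naive_hash(graph_b)

def toy_canonicalize(nquads: str) -> str:
    """Relabel blank nodes in order of first appearance.
    Adequate only for this toy example; real canonicalization
    must handle graphs where simple orderings break down."""
    labels: dict[str, str] = {}
    def relabel(m: re.Match) -> str:
        return labels.setdefault(m.group(0), f"_:c{len(labels)}")
    return re.sub(r"_:[A-Za-z0-9]+", relabel, nquads)

# ...so after deterministic relabeling the hashes agree.
assert naive_hash(toy_canonicalize(graph_a)) == naive_hash(toy_canonicalize(graph_b))
```

The hard mathematics the review must cover is precisely what replaces `toy_canonicalize` for arbitrary graphs, where symmetric blank-node structures defeat any such simple ordering.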

2. We have two inputs: one is the two papers by Aidan [1][2], and the other is the draft published by Manu & David [3]. (Any others?)

3. Aidan's algorithm has undergone a rigorous peer review for the WWW and the journal versions; although the algorithm has to be extended from RDF graphs to RDF datasets, that step seems straightforward (and has already been outlined by Ivan [4]).

4. There is a need for a similar writeup and peer review of the algorithm of Manu & David. Digital Bazaar will start the process of (a) writing down the algorithm in mathematical terms, much like Aidan's papers, and (b) getting the results peer-reviewed. We will have to find reviewers for step (b). (Whether that result would be published in a journal/conference is orthogonal to the review. Actually, a 'direct' peer review is probably faster…)

5. As a tentative goal, it would be great to give an overview of what we may try (together with the first results of the peer review for [3]) at the W3C Workshop on Web Standardization for Graph Data[5]. (That may be a bit too tight, though.)

6. Once step 4 is done, we will have to start discussions on how to "merge" (algorithmically) the two inputs to end up with a unified, standard one, although part of that work can also be done in a Working Group.

7. The real goal is to advance the process so that a draft WG charter could be prepared, and presented, at TPAC 2019, i.e., September 2019. The scope of such a WG still has to be discussed: does it include signatures of Linked Data in general, or should it focus on canonicalization only? To be seen.

Manu, did I forget anything? Aidan, does this sound reasonable to you?


[1] http://www.www2015.it/documents/proceedings/proceedings/p430.pdf
[2] http://aidanhogan.com/docs/rdf-canonicalisation.pdf
[3] http://json-ld.github.io/normalization/spec/index.html
[4] https://github.com/iherman/canonical_rdf
[5] https://www.w3.org/Data/events/data-ws-2019/

Ivan Herman, W3C 
Home: http://www.w3.org/People/Ivan/
mobile: +31-641044153
ORCID ID: https://orcid.org/0000-0003-0782-2704

Received on Sunday, 28 October 2018 06:00:30 UTC
