- From: Dave Longley <dlongley@digitalbazaar.com>
- Date: Wed, 19 Nov 2014 11:38:27 -0500
- To: Owen Shepherd <owen.shepherd@e43.eu>, public-socialweb@w3.org
- CC: public-credentials@w3.org
- Message-ID: <546CC783.8040800@digitalbazaar.com>
On 11/18/2014 11:15 AM, Owen Shepherd wrote:
> "Identified as a need for both the Social Web WG and Web Annotations
> WG due to dependence on JSON-LD"
>
> Says who? The SocialWG's deliverables explicitly make JSON-LD
> processing a non-requirement. If we require LD processing, then I
> would express absolutely no surprise when everyone ignores it. In
> other words, all of our work would be for naught and our standard a
> failure.
>
> Which brings us on to:
>
> "Graph Normalization algorithm is hidden from developers, but very
> complex"
>
> Erm, how? Surely some developer must implement it?
>
> Or are you suggesting that this is a non-issue because it's hidden by
> a library? How is this supposed to work until the standard is
> sufficiently ubiquitous that I can get a complete and adaptable
> implementation no matter what language I use? Your mitigating factor
> isn't a valid mitigating factor. That should just say:
>
> "Graph Normalization algorithm is very complex"
>
> Experience shows that complicated specifications generally fail
> unless they bring something truly compelling to the table.

We should all be careful making statements like "the algorithm is very
complex"; they are too subjective. The algorithm may not actually be
considered that complex by developers. Unfortunately, I need to update
the spec (it's extremely dated) so it's clearer what the actual
complexities are (or aren't) -- I just haven't had the time.

Here's a very high-level overview of what the algorithm does:

1. Convert RDF syntax to abstract quads.
2. Map each blank node to all quads in which it is mentioned.
3. While the number of unnamed blank nodes is decreasing:
   3.1. For each unnamed blank node, hash all of its related quads
        (hash sorted, serialized N-Quads with minimal whitespace).
        3.1.1. Store whether the resulting hash is unique or not.
   3.2. Assign a name to each blank node with a unique hash, in
        sorted order.
4. For each duplicate hash group, in sorted order:
   4.1. For each unnamed blank node in the group:
        4.1.1. Group adjacent blank nodes by a hash of relationship
               and quad hash.
        4.1.2. For each group:
               4.1.2.1. Digest the group hash.
               4.1.2.2. Build the shortest, lexicographically least
                        "path" of blank nodes in the group that visits
                        all nodes in the group without cycling, and
                        that serializes them using the hash that
                        results from recursively calling this
                        algorithm. Track the order in which the blank
                        nodes were visited.
               4.1.2.3. Digest the chosen path.
        4.1.3. Process the resulting group hashes in sorted order,
               naming any remaining blank nodes using the order in
               which the nodes were visited when constructing the
               associated shortest "path".
5. Done.

There are already implementations of this algorithm in JavaScript,
Java, Python, and PHP.

> When the alternative is the widely adopted, part-of-WebCrypto,
> simpler, more flexible JOSE... how do you expect to win?
>
> (Not that I think SM is entirely meritless -- e.g. clear signing is
> useful -- just that it is probably dead in the water as long as LD
> processing is mandated)
>
> On 18 Nov 2014 15:56, "☮ elf Pavlik ☮" <perpetual-tripper@wwelves.org>
> wrote:
>
> I remember some conversation during TPAC about systems behind a
> firewall not being able to make requests back to some other server,
> as required by webmention.org. I think that JSON-LD Secure Messaging
> could offer some solutions here.
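As a rough illustration of steps 2 and 3.1 in the overview above, here is a toy sketch in Python. The quad serialization and the `_:a` placeholder convention are simplified stand-ins for this example, not the spec's exact rules:

```python
import hashlib

# Toy dataset: two quads mentioning two blank nodes.
# Each quad is (subject, predicate, object, graph).
quads = [
    ("_:b0", "http://xmlns.com/foaf/0.1/knows", "_:b1", ""),
    ("_:b1", "http://xmlns.com/foaf/0.1/name", '"Alice"', ""),
]

# Step 2: map each blank node to all quads in which it is mentioned.
mentions = {}
for quad in quads:
    for term in quad[:3]:
        if term.startswith("_:"):
            mentions.setdefault(term, []).append(quad)

# Step 3.1: hash each unnamed blank node's related quads. The candidate
# node is replaced with a placeholder so the hash does not depend on its
# current label; quads are serialized, sorted, and digested.
def hash_related_quads(node, related):
    lines = []
    for s, p, o, g in related:
        s2 = "_:a" if s == node else s
        o2 = "_:a" if o == node else o
        lines.append("%s <%s> %s %s." % (s2, p, o2, g))
    serialized = "".join(sorted(lines))  # sorted, minimal whitespace
    return hashlib.sha256(serialized.encode()).hexdigest()

hashes = {n: hash_related_quads(n, q) for n, q in mentions.items()}

# Step 3.1.1 / 3.2: nodes whose hashes are unique across the dataset
# can be assigned canonical names first, in sorted hash order.
unique = sorted(n for n, h in hashes.items()
                if list(hashes.values()).count(h) == 1)
```

In this tiny example both blank nodes end up with unique first-degree hashes, so step 4 (the more involved path-building for duplicate hash groups) never runs; it is only needed for datasets with symmetric blank-node structures.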
> Mozilla Open Badges already has a hosted version (similar to the
> pattern used for WebMention) as well as a signed version, both
> explained in the current (pre-JSON-LD) spec:
> https://github.com/openbadges/openbadges-specification/blob/master/Assertion/latest.md#badge-verification
>
> -------- Forwarded Message --------
> Subject: Digital Signatures for Credentials
> Resent-Date: Mon, 17 Nov 2014 02:32:41 +0000
> Resent-From: public-credentials@w3.org
> Date: Sun, 16 Nov 2014 21:32:16 -0500
> From: Manu Sporny <msporny@digitalbazaar.com>
> To: Credentials Community Group <public-credentials@w3.org>
>
> During the call last week, we touched on the last major item (digital
> signatures) that needs to be aligned between the Badge Alliance
> technology stack and the Credentials technology stack. Like all
> technology, there are upsides and downsides to each approach. I
> thought I'd try to summarize them in this email.
>
> The Credentials technology stack[1] focuses on extensibility via
> Linked Data / JSON-LD, and thus uses a digital signature mechanism
> that was built for graph-based data.
>
> The Badge Alliance technology stack has focused on pure JSON data and
> on re-using the IETF's JOSE digital signature stack. I've written
> about Digital Bazaar's concerns with JOSE before[2].
>
> In general, both technologies allow a developer to:
> * Digitally sign data
> * Verify digitally signed data
> * Express public/private keypairs
> * Encrypt and decrypt data in message envelopes
>
> In this respect, neither technology is that different from what XML
> Digital Signatures enables one to do.
>
> Both SM and JOSE use JSON as the basic container format due to JSON's
> popularity with developers. When comparing the SM vs. JOSE technology
> stacks, here are some of the key pros and cons:
>
> JSON-LD Secure Messaging Pros:
> * Clear-text signatures (easier to see/debug what's going on)
> * Works with any RDF syntax (N-Quads, TURTLE, etc.)
> * Ensures discoverability of public keys via the Web
> * Simpler interface for Web developers
> * Extensible message format due to JSON-LD
> * Designed to integrate cleanly with HTTP Signatures
> * Identified as a need for both the Social Web WG and the
>   Web Annotations WG due to dependence on JSON-LD
>
> JSON-LD Secure Messaging Cons:
> * Not an official standard yet
> * Graph Normalization algorithm is hidden from developers, but very
>   complex
>
> JOSE Pros:
> * First-mover advantage
> * Already an IETF standard with thorough security review
> * More software libraries exist for JOSE
>
> JOSE Cons:
> * Signed data is an opaque blob, which is very difficult to debug
> * Fairly difficult for Web developers to use, due to exposing too
>   much complexity
> * Format is not extensible; changes require coordination through
>   the IETF
> * No standardized public key discoverability mechanism
>
> The biggest downside of the SM approach is that it's not a W3C
> standard yet, and that will take some time (1-2 years). The
> technology is done and there are multiple interoperable
> implementations out there, so we're not concerned about it not
> getting through the standardization process once it enters the
> process. With the recent hallway discussions at W3C TPAC, we feel
> that we should be able to get the minimum number of W3C member votes
> necessary to take the specs to the REC track.
>
> So, with that introduction -- are there any thoughts on SM vs. JOSE?
> Does anyone feel strongly one way or the other? Any pros/cons that
> are not in the list above that should be?
> -- manu
>
> [1] http://opencreds.org/specs/source/roadmap/#technology-stack
> [2] http://lists.w3.org/Archives/Public/public-webpayments/2013Aug/0004.html
>
> --
> Manu Sporny (skype: msporny, twitter: manusporny, G+: +Manu Sporny)
> Founder/CEO - Digital Bazaar, Inc.
> blog: The Marathonic Dawn of Web Payments
> http://manu.sporny.org/2014/dawn-of-web-payments/

-- 
Dave Longley
CTO
Digital Bazaar, Inc.
http://digitalbazaar.com
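To make the "clear-text signature vs. opaque blob" contrast from the comparison above concrete, here is a rough sketch of the two message shapes. The property names (`GraphSignature2012`, `creator`, `signatureValue`) and the signature values are illustrative placeholders, not taken verbatim from either specification:

```python
import base64
import json

# JSON-LD Secure Messaging style: the signed payload stays readable
# JSON-LD, with the signature attached as a clear-text property.
sm_message = {
    "@context": "https://w3id.org/security/v1",
    "title": "Hello world",
    "signature": {
        "type": "GraphSignature2012",             # illustrative suite name
        "creator": "https://example.com/keys/1",  # key discoverable via the Web
        "signatureValue": "MC4CFQ...",            # placeholder signature bytes
    },
}

# JOSE (JWS compact serialization) style: the same payload becomes a
# dot-separated token of base64url segments; you must decode it to
# inspect what was signed.
def b64url(obj):
    """base64url-encode a JSON object without padding."""
    raw = json.dumps(obj, separators=(",", ":")).encode()
    return base64.urlsafe_b64encode(raw).rstrip(b"=").decode()

header = {"alg": "HS256", "typ": "JWT"}
payload = {"title": "Hello world"}
jws_token = b64url(header) + "." + b64url(payload) + ".sig-placeholder"

# The SM form is directly readable...
print(sm_message["title"])  # -> Hello world

# ...while the JWS form must be split and decoded first.
segment = jws_token.split(".")[1]
padded = segment + "=" * (-len(segment) % 4)
decoded = json.loads(base64.urlsafe_b64decode(padded))
print(decoded["title"])  # -> Hello world
```

This is only a shape comparison; real implementations of either approach also canonicalize the payload and compute the signature with an actual key, which is omitted here.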
Received on Wednesday, 19 November 2014 16:38:51 UTC