- From: David Chadwick <D.W.Chadwick@kent.ac.uk>
- Date: Mon, 29 Oct 2018 19:34:28 +0000
- To: public-credentials@w3.org
I don't know if this will help the discussion, but it is a bit of history that you might find interesting. When X.509 was first defined in the 1980s, it was thought that we needed the Distinguished Encoding Rules (DER) because the Basic Encoding Rules (BER) allowed a data type and value to be encoded in multiple different ways; DER mandates one way only. The theory was that a recipient, on receiving a signed data object (like an X.509 certificate), would validate the signature on the received binary data, then decode the binary data object and present it to the application (as data types and values) for processing. Once the application had finished its processing, it might want to forward the original signed object to another entity (with or without accompanying signed or unsigned data). This second binary encoding by the application, if it used BER, would invariably change the binary data and invalidate the original signature. Hence DER was born.

Some 20 years later, it was stated at an X.509 meeting that we had got it wrong: DER never needed to be invented. The recipient should have stored the incoming signed binary data object (the X.509 certificate), and if it wanted to subsequently forward it to another entity, it should have used the binary data "as is", without altering it. Any decoding of the signed data should only have been needed for the application's processing, and should not have been used to re-encode the original signed data object.

Canonicalization sounds a bit like DER to me.

David

On 29/10/2018 17:07, Chris Boscolo wrote:
>
> Manu,
> Regarding your comment about the Canonicalization requirement:
>
> This requirement is a problem because it forces a new requirement onto
> the JSON parser that many like myself don't think is a good idea.
>
> For example, one thing we would love to see is for IoT devices to play
> a role in this new DID/VC world we are building.
> Many of these embedded systems already have a minimal JSON parser, as
> well as Base64 libraries and hardware encryption support. That means
> they could build a JWT version of DID/VC over the weekend
> (figuratively). Requiring them to update to a new JSON-parsing library
> to support this is a barrier to adoption.
>
> BTW, as one who has developed protocol-level encryption software, the
> comment "ability to add non-signature-destroying whitespace" makes me
> cringe. It seems like it is just needlessly opening the door to a new
> attack vector.
>
> -chrisb
>
> On Mon, Oct 29, 2018 at 7:36 AM Manu Sporny <msporny@digitalbazaar.com
> <mailto:msporny@digitalbazaar.com>> wrote:
>
>     > - Canonicalization requirement
>
>     Why is the requirement a problem? You could just shove the entire
>     VC in a JWT, but then you lose all the benefits of canonicalization
>     (such as syntax-agnostic signatures, ability to protect the entire
>     message, ability to add non-signature-destroying whitespace,
>     compatibility with schema.org <http://schema.org>, etc.).
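[Editor's note: the BER-vs-DER point above can be made concrete. The sketch below is illustrative only, not a real ASN.1 library: it hand-decodes a single INTEGER whose length octets use BER's short form versus its legal-in-BER long form. Both byte strings decode to the same value, so a decode-then-re-encode round trip can change the signed bits; DER forbids the long form here, which is the "one way only" constraint David describes.]

```python
def decode_ber_integer(data: bytes) -> int:
    # Minimal, illustrative parser for one BER-encoded INTEGER.
    assert data[0] == 0x02  # 0x02 is the ASN.1 INTEGER tag
    if data[1] & 0x80:
        # Long-form length: low 7 bits give the count of length octets.
        n = data[1] & 0x7F
        length = int.from_bytes(data[2:2 + n], "big")
        value = data[2 + n:2 + n + length]
    else:
        # Short-form length: the octet itself is the length.
        length = data[1]
        value = data[2:2 + length]
    return int.from_bytes(value, "big", signed=True)

# Two BER encodings of the INTEGER 127 (only the first is valid DER):
short_form = bytes([0x02, 0x01, 0x7F])        # tag, short-form length, value
long_form  = bytes([0x02, 0x81, 0x01, 0x7F])  # tag, long-form length, value

assert decode_ber_integer(short_form) == decode_ber_integer(long_form) == 127
assert short_form != long_form  # same value, different bytes -> broken signature
```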
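[Editor's note: for the canonicalization side of the thread, a minimal sketch using only Python's standard json module shows why whitespace and key order stop mattering once both parties hash a canonical form. This is a simplification: real canonicalization schemes also pin down number and string formatting, which this sketch ignores.]

```python
import hashlib
import json

def canonicalize(doc: dict) -> bytes:
    # Simplified canonical form: sorted keys, no insignificant whitespace.
    return json.dumps(doc, sort_keys=True, separators=(",", ":")).encode("utf-8")

# Two byte-wise different serializations of the same data:
a = '{"id": "did:example:123", "type": "VerifiableCredential"}'
b = '{\n  "type": "VerifiableCredential",\n  "id": "did:example:123"\n}'

# Hashing (and hence signing) the raw bytes is formatting-sensitive...
assert hashlib.sha256(a.encode()).digest() != hashlib.sha256(b.encode()).digest()

# ...but after canonicalization both serializations hash identically,
# so added whitespace or reordered keys no longer destroy a signature.
digest_a = hashlib.sha256(canonicalize(json.loads(a))).digest()
digest_b = hashlib.sha256(canonicalize(json.loads(b))).digest()
assert digest_a == digest_b
```

This is also the trade-off the thread is debating: the JWT approach instead signs the exact serialized bytes (Base64url-encoded), which needs no canonicalization step in the parser but makes the signature depend on one specific byte sequence.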
Received on Monday, 29 October 2018 19:34:54 UTC