
Re: [webauthn] Consider requiring canonical CBOR throughout

From: Adam Langley via GitHub <sysbot+gh@w3.org>
Date: Tue, 23 May 2017 19:28:00 +0000
To: public-webauthn@w3.org
Message-ID: <issue_comment.created-303506382-1495567679-sysbot+gh@w3.org>
Every bit of unneeded flexibility that you can eliminate in the encoding is good, so please ban indefinite lengths and unsorted maps if that's all you feel you can do.
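(As an illustration of what "sorted maps" buys you: canonical CBOR per RFC 7049 §3.9 orders map keys by their encoded bytes, shorter encodings first, ties broken lexicographically. A minimal sketch, with made-up example keys:)

```python
def sort_canonical(encoded_keys):
    # RFC 7049 canonical form: shorter key encodings sort first;
    # equal-length encodings compare lexicographically, byte by byte.
    return sorted(encoded_keys, key=lambda k: (len(k), k))

# b"\x00" (the uint 0), b"\x61z" (text "z"), b"\x62ab" (text "ab")
assert sort_canonical([b"\x62ab", b"\x61z", b"\x00"]) == [b"\x00", b"\x61z", b"\x62ab"]
```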

Minimal lengths in a variable-length encoding do require either knowing the value before writing it or else shifting bytes after the fact. For a host, shifting a few bytes in L1 cache at human-interaction rates is trivial. A small device might choose to calculate the lengths beforehand. That's just the cost for the smaller size of variable-length integers.
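(Picking the minimal form is a handful of comparisons. A sketch of a minimal-length CBOR unsigned-integer encoder, major type 0 per RFC 7049; the function name is mine:)

```python
import struct

def cbor_uint(n: int) -> bytes:
    """Encode an unsigned integer in its shortest CBOR form (major type 0)."""
    if n < 24:
        return bytes([n])                      # value fits in the initial byte
    if n < 0x100:
        return bytes([24, n])                  # additional info 24: 1-byte argument
    if n < 0x10000:
        return b"\x19" + struct.pack(">H", n)  # 25: 2-byte argument
    if n < 0x100000000:
        return b"\x1a" + struct.pack(">I", n)  # 26: 4-byte argument
    return b"\x1b" + struct.pack(">Q", n)      # 27: 8-byte argument

# The same value zero-padded into a 4-byte argument decodes identically
# but produces different bytes -- exactly the flexibility being argued against:
assert cbor_uint(10) == b"\x0a"
assert b"\x1a" + struct.pack(">I", 10) != cbor_uint(10)
```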

However, encoding strictness is not purely a sliding scale: a canonical encoding is qualitatively different because it gives a bijection between messages and encodings. This feature makes testing more effective. If there's only a single encoding for a given structure of message, and you've tested that encoding, then your implementation works. In contrast, consider a world with non-canonical length encodings:

One embedded implementation is running on a 16-bit processor and decides to use a uint16 to hold lengths because no message that it cares about will exceed 64KiB. Another implementation decides to simplify serialisation by writing all lengths as uint32s, padding with zeros as needed. Both will likely work with common host implementations that are "liberal in what they accept but conservative in what they send" but fail when talking to each other.
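(The failure mode can be made concrete. A sketch, assuming a hypothetical length-prefixed framing where one side writes 16-bit lengths and the other writes zero-padded 32-bit lengths; all names are mine:)

```python
import struct

def enc_len_u16(n: int) -> bytes:
    # Embedded encoder: 16-bit lengths only (assumes messages < 64 KiB).
    return struct.pack(">H", n)

def enc_len_u32(n: int) -> bytes:
    # "Simplified" encoder: always writes 32-bit lengths, zero-padded.
    return struct.pack(">I", n)

def dec_len_u16(buf: bytes) -> tuple[int, bytes]:
    # Embedded decoder: reads exactly two length bytes.
    return struct.unpack(">H", buf[:2])[0], buf[2:]

msg = enc_len_u32(5) + b"hello"   # b"\x00\x00\x00\x05hello"
length, rest = dec_len_u16(msg)
assert length == 0                # the padding bytes were read as the length
assert rest == b"\x00\x05hello"   # framing is now corrupted
```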

A canonical encoding avoids these expensive interop issues.

BoringSSL is strict when parsing (although it's DER, not CBOR) in order to improve the ecosystem. We have caught a lot of those sorts of issues early, while they are still cheap to fix. That's what I meant when I said "a single common implementation that is strict will ensure that the whole ecosystem remains healthy and interoperable". (Our [GREASE](https://tools.ietf.org/html/draft-ietf-tls-grease-00) efforts have a similar motivation.)

Strictness in the parser can also save lots of complexity and issues elsewhere. For example, a failure to be strict about DER lengths in NSS resulted in a [complete break of signature validation](https://www.imperialviolet.org/2014/09/26/pkcs1.html), and thus a complete break of TLS, and lived in the code for many years. Strictness in the DER parser has also allowed us to remove workarounds elsewhere for cases where bugs had crept into the ecosystem. The payoff from a few checks at the bottom has been significant.

-- 
GitHub Notification of comment by agl
Please view or discuss this issue at https://github.com/w3c/webauthn/issues/455#issuecomment-303506382 using your GitHub account
Received on Tuesday, 23 May 2017 19:28:07 UTC
