- From: Ryan Sleevi <sleevi@google.com>
- Date: Wed, 19 Dec 2012 16:07:08 -0800
- To: Richard Barnes <rbarnes@bbn.com>
- Cc: "public-webcrypto@w3.org Working Group" <public-webcrypto@w3.org>
On Wed, Dec 19, 2012 at 10:51 AM, Richard Barnes <rbarnes@bbn.com> wrote:
>
> Sorry it's taken a few days to reply, I've been pondering this off and on while coding...
>
> Could you elaborate a little on how this lock-in works? I'm not getting the "???" parts in this story:
> 1. Browser provides default algorithms / parameters
> 2. Web app developer builds application that does / doesn't do ???
> 3. Browser changes default algorithms / parameters
> 4. Web app breaks because ???
>
> What I was trying to argue originally is that:
> -- Breakage only happens if the app has cached encrypted/signed data and hasn't cached the algorithms / params

Wrong. Breakage happens any time a server is configured to presume that the defaults are some fixed value. Consider a server that assumes that, because RSA-OAEP defaults to an MGF of MGF1-with-SHA-1, the messages it will receive use that MGF.

> -- However, for cached data to be useful at all, in most cases, the app needs to cache something anyway (e.g., an IV/nonce)
> -- So if we wrap the algorithms together with those things, then they'll get cached, and there's no breakage

You're also failing to consider that defaults have value as "recommendations", which, as I've repeatedly explained, is a topic we should strive to avoid.

If the spec makes MGF1-with-SHA-1 the default, then it can never change. Ever. Any change means existing web pages will break, and quite frankly, no user agent wants to break the Internet. So no UA will support/implement future versions that, when SHA-1 is declared "totally broken", define the new default as MGF1-with-SHA-256, because that would break every app that assumed that unspecified == MGF1-with-SHA-1.

So once you declare a default, it's fixed, for the entire future of the Internet, for as long as that API exists. You can't version-bump it - you have to replace it outright. That's not a good strategy.

Again, if libraries want to provide such defaults, I'm all for it. Go right ahead.
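The lock-in failure mode above can be sketched outside the browser entirely. The following is a hypothetical Python illustration - the blob layout and function names are invented for this sketch, not part of the Web Crypto API. A consumer that resolves a missing parameter from the current spec default silently changes behaviour the moment the default changes, while a producer that records every parameter alongside the ciphertext does not:

```python
# Hypothetical sketch of the default-parameter lock-in problem.
# All names here are invented for illustration; this is not Web Crypto API code.

SPEC_DEFAULT_MGF = "MGF1-SHA-1"  # the default the 2012 draft would have baked in

def make_blob_relying_on_default(ciphertext):
    # Producer omits the MGF because "the default is fine".
    return {"ciphertext": ciphertext}

def make_blob_explicit(ciphertext, mgf):
    # Producer records every parameter alongside the data.
    return {"ciphertext": ciphertext, "mgf": mgf}

def resolve_mgf(blob, current_default):
    # A server/consumer fills in missing parameters from the *current* default.
    return blob.get("mgf", current_default)

# Today: everything agrees.
implicit = make_blob_relying_on_default(b"...")
explicit = make_blob_explicit(b"...", "MGF1-SHA-1")
assert resolve_mgf(implicit, SPEC_DEFAULT_MGF) == "MGF1-SHA-1"
assert resolve_mgf(explicit, SPEC_DEFAULT_MGF) == "MGF1-SHA-1"

# Later: SHA-1 is declared broken and a spec revision changes the default.
NEW_DEFAULT_MGF = "MGF1-SHA-256"
assert resolve_mgf(explicit, NEW_DEFAULT_MGF) == "MGF1-SHA-1"    # still correct
assert resolve_mgf(implicit, NEW_DEFAULT_MGF) == "MGF1-SHA-256"  # silently wrong
```

The last line is the breakage: every blob written under the old default now decrypts with the wrong parameters, which is why no UA would ship the new default and the old one becomes permanent.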
This is the general problem that is better served by a library, because then user agents do not need to worry about such things.

> If I'm not convincing you on algorithms, how about at least on ephemeral parameters? Things like:
> -- AesCbcParams.iv
> -- AesCtrParams.counter
> -- Pbkdf2Params.salt

Definitely not. There is no way a user agent can 'know' whether AesCbcParams.iv or AesCtrParams.counter are 'safe' values (eg: values that have not been used before with the associated key). Such a behaviour is entirely dependent on the user of the Web Crypto API and the protocol(s) they may be implementing. So no 'secure' application would/should do anything short of explicitly specifying what the value should be. The only developers such defaults help are those who don't understand crypto, and they 'help' by encouraging, or making it easier, to do things insecurely.

> All of these things need to be cached by the application anyway, and they can all safely be set to random values. So couldn't we just have the API generate them instead of forcing the developer to call getRandomValues themselves?

Again, I think this is something that should be addressed at the high-level layer - the same as it is with every existing cryptographic API. Note, I'm not arguing from tradition; I'm trying to point out that the supposed benefits aren't benefits, and so consistency in concepts should be the driving factor here.

Something like SJCL, KeyCzar, or NaCl for JS can certainly implement defaults - whether for parameters or for algorithms - in a manner that is secure for their usage. So, too, could a "high-level" API, if the WG decides to do one. But mixing defaults into the low-level API bakes in assumptions that prevent user agents from reasonably changing them and, I would posit, does not save any meaningful code in the long run.
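The point about AesCtrParams.counter can be made concrete: "safe" means "never used before with this key", which is protocol state the API cannot see. A minimal Python sketch of counter reuse, using a toy hash-based keystream as a stand-in for AES-CTR output (real AES-CTR has exactly the same property):

```python
# Toy demonstration of why counter/IV reuse is fatal in a CTR-style cipher.
# keystream() is an invented stand-in for AES-CTR keystream generation;
# it only needs to be deterministic in (key, counter), as AES-CTR is.
import hashlib
import os

def keystream(key, counter, length):
    out = b""
    block = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter + block.to_bytes(4, "big")).digest()
        block += 1
    return out[:length]

def xor(a, b):
    return bytes(x ^ y for x, y in zip(a, b))

key = os.urandom(16)
counter = os.urandom(16)  # reused for two messages: the unsafe case
p1 = b"attack at dawn!!"
p2 = b"retreat at dusk!"
c1 = xor(p1, keystream(key, counter, len(p1)))
c2 = xor(p2, keystream(key, counter, len(p2)))

# An eavesdropper with no key recovers the XOR of the two plaintexts:
assert xor(c1, c2) == xor(p1, p2)
```

Only the application knows whether a given counter value has already been consumed with a given key across messages, sessions, and page loads; a per-call random default inside the UA could not track that state, which is why the caller must supply these values explicitly (via getRandomValues or its own protocol bookkeeping).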
Received on Thursday, 20 December 2012 00:07:35 UTC