- From: Ryan Sleevi <sleevi@google.com>
- Date: Wed, 29 May 2013 19:44:37 -0700
- To: Jeffrey Walton <noloader@gmail.com>
- Cc: WebCrypto Comments <public-webcrypto-comments@w3.org>
- Message-ID: <CACvaWvY4Xsvs+xzrEW+tGbePTCbjR+v7BJ76Jt+5fULSDSrwig@mail.gmail.com>
On May 29, 2013 7:22 PM, "Jeffrey Walton" <noloader@gmail.com> wrote:
>
> Hi All,
>
> Is there any interest in providing APIs to validate parameters?

None here.

I don't see any useful way this can be implemented in a cross-platform,
cross-library way, especially taking into consideration the desire to
eventually support alternative (read: hardware) cryptographic
implementations.

I consider this to be something a UA could do under the hood if it wanted
(though none of the standard APIs do), and too low-level to expose via the
API directly (a la BigNum).

> One of the things I try to teach my guys is that they must validate
> cryptographic parameters, and they cannot apply a secret if validation
> fails. Unvalidated keys could have flaws that allow for recovery of
> the secret. For example, if an RSA public key does not validate, then
> it should not be used to transport a secret.

Define your threat model. Where do such keys come from? Why are they not
trusted - and yet being used to transfer secrets? That seems a conflicting
statement.

> GnuPG is a somewhat special case, since its key pair is composed of
> Lim-Lee primes. In this case, we can apply a "small prime" test, but
> not a "strong prime" test.
>
> Taking both standard practices and GnuPG into consideration, I believe
> the API should accept a level for validation. Arbitrarily, levels
> should probably follow the model set by certificate classes (1 = low,
> N = high): Level 0 would mean no validation; Level 1 would perform the
> small prime test; Level 2 would perform the strong prime test (plus
> Level 1); Level 3 could add the tests that take prolonged time (plus
> Level 2).
>
> It's probably worth mentioning that local keys should be validated too
> - for example, after loading a private key from a key store.

Why? What's the threat model for this case?

> So tests like Jacobi and Miller-Rabin are also of interest.
>
> It's common to offer validation and levels in other libraries. For
> example, the Crypto++ library allows a developer to validate an object
> and specify a level
> (http://www.cryptopp.com/docs/ref/class_crypto_material.html#aaa7d67d0c12712de0e33713c73f5b718):
>
>     level denotes the level of thoroughness:
>     0 - using this object won't cause a crash or exception (rng is ignored)
>     1 - this object will probably function (encrypt, sign, etc.) correctly
>         (but may not check for weak keys and such)
>     2 - make sure this object will function correctly, and do reasonable
>         security checks
>     3 - do checks that may take a long time
>
> Offering validation is consistent with the low-level API that is
> taking shape with the WebCrypto API.

Disagree - seems more like BigNum.

> Finally, other components, such as JOSE (JSON Web Algorithms (JWA),
> JSON Web Encryption (JWE), JSON Web Key (JWK), and JSON Web Signature
> (JWS)), do not even mention parameter validation. Someone has to do
> it, and it appears others are expecting someone else to do it. This is
> the Bystander Effect, and it's a known problem in security engineering
> (cf. Gutmann's Engineering Security, p. 82).
>
> Jeff
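[Editorial note: for concreteness, below is a minimal sketch of the Miller-Rabin probable-prime test mentioned in the thread, written in TypeScript with BigInt. The function names, the fixed witness bases, and the example input are illustrative choices for this sketch only; they are not part of any WebCrypto proposal or of Crypto++'s API.]

```typescript
// Miller-Rabin probable-prime test over BigInt (illustrative sketch).
// With this fixed set of witness bases the test is deterministic for
// inputs below 3.3 * 10^24; for cryptographic sizes, random bases and
// more rounds would normally be used.

function modPow(base: bigint, exp: bigint, mod: bigint): bigint {
  // Square-and-multiply modular exponentiation.
  let result = 1n;
  base %= mod;
  while (exp > 0n) {
    if (exp & 1n) result = (result * base) % mod;
    base = (base * base) % mod;
    exp >>= 1n;
  }
  return result;
}

function isProbablePrime(
  n: bigint,
  bases: bigint[] = [2n, 3n, 5n, 7n, 11n, 13n, 17n, 19n, 23n, 29n, 31n, 37n]
): boolean {
  if (n < 2n) return false;
  // Quick trial division by the smallest primes.
  for (const p of [2n, 3n, 5n, 7n]) {
    if (n === p) return true;
    if (n % p === 0n) return false;
  }
  // Write n - 1 as d * 2^r with d odd.
  let d = n - 1n;
  let r = 0n;
  while ((d & 1n) === 0n) { d >>= 1n; r++; }

  witness: for (const a of bases) {
    if (a % n === 0n) continue;
    let x = modPow(a, d, n);
    if (x === 1n || x === n - 1n) continue;
    for (let i = 1n; i < r; i++) {
      x = (x * x) % n;
      if (x === n - 1n) continue witness;
    }
    return false; // a is a witness that n is composite
  }
  return true; // n is probably prime
}

// Example: 2^61 - 1 is a Mersenne prime.
console.log(isProbablePrime((1n << 61n) - 1n)); // true
```

A level-based validator of the kind discussed above would presumably compose checks like this: cheap structural and small-prime checks at the low levels, and probabilistic tests such as Miller-Rabin (with more rounds, or Lucas/Jacobi-style tests) only at the higher, slower levels.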
Received on Thursday, 30 May 2013 02:45:08 UTC