Re: API for Validating Crypto Parameters

On May 29, 2013, at 11:20 PM, Jeffrey Walton <noloader@gmail.com> wrote:

> On Wed, May 29, 2013 at 10:44 PM, Ryan Sleevi <sleevi@google.com> wrote:
>> 
>> On May 29, 2013 7:22 PM, "Jeffrey Walton" <noloader@gmail.com> wrote:
>> 
>>> ...
>>> One of the things I try to teach my guys is that they must validate
>>> cryptographic parameters, and that they must not apply a secret if
>>> validation fails. Unvalidated keys could have flaws that allow recovery
>>> of the secret. For example, if an RSA public key does not validate, it
>>> should not be used to transport a secret.
>> 
>> Define your threat model. Where do such keys come from? Why are they not
>> trusted - and yet being used to transfer secrets? That seems a
>> conflicting statement.
> What I have in mind is (1) weak keys due to bad generators (Debian,
> NetBSD, OpenSSL, etc.) and (2) weak keys due to bad algorithms and/or a
> lack of validation. Not everyone uses hardware RNGs to seed
> /dev/{u}random, not everyone uses "well known" libraries for generation
> (cf. the small percentage of non-standard public exponents), and not
> everyone performs rigorous testing before publishing their public keys.
> 

What is the algorithm you have in mind for testing these properties? You're not really going to get a good entropy measurement on a 16-byte key.
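
For an RSA public key, the checkable properties are mostly structural ones. Here is a rough sketch of that kind of test in Python; the names and the size threshold below are purely illustrative, not taken from any particular library:

    SMALL_PRIMES = [3, 5, 7, 11, 13, 17, 19, 23, 29, 31, 37]

    def rsa_public_key_problems(n, e, min_modulus_bits=2048):
        """Return a list of structural problems found in the public key (n, e)."""
        problems = []
        if n.bit_length() < min_modulus_bits:
            problems.append("modulus shorter than %d bits" % min_modulus_bits)
        if n % 2 == 0:
            problems.append("modulus is even")
        if e < 3 or e % 2 == 0:
            problems.append("public exponent is too small or even")
        if e >= n:
            problems.append("public exponent is not smaller than the modulus")
        for p in SMALL_PRIMES:
            if n % p == 0:
                problems.append("modulus has small factor %d" % p)
                break
        # Checks against known-weak key lists (e.g. the Debian keys) would be
        # a separate, blacklist-driven step.
        return problems

None of that says anything about how the key was actually generated, which is where the weak-generator concern really lives.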

--Richard



>>> GnuPG is a somewhat special case since its key pair is composed of
>>> Lim-Lee primes. In this case, we can apply a "small prime" test, but
>>> not a "strong prime" test.
>>> 
>>> Taking both standard practices and GnuPG into consideration, I believe
>>> that means the API should accept a level for validation. Arbitrarily,
>>> levels should probably follow the model set by certificate classes (1 =
>>> low, N = high), so Level 0 would mean no validation; Level 1 would
>>> perform the small prime test; Level 2 would perform the strong prime
>>> test (plus Level 1); and Level 3 could provide the tests that take a
>>> prolonged time (plus Level 2).
>>> 
>>> It's probably worth mentioning that local keys should be validated too
>>> - for example, after loading a private key from a key store.
>> 
>> Why? What's the threat model for this case?
> Three concerns here: (1) a key was previously stored without proper
> validation; (2) an inadvertent transformation of the material; or (3) an
> inadvertent bit flip (in storage, on the bus, or in memory).
> 
> In all cases, I don't want to apply secrets to unvalidated parameters.
> 
> Jeff
> 
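
For what it's worth, here is a minimal Python sketch of the level-based interface proposed in the quoted text. The level numbering follows the proposal; the function names, and my reading of the "small prime" test as trial division by small primes plus a Miller-Rabin probable-primality check, are assumptions on my part rather than anything the thread specifies:

    import random

    # Validation levels, numbered as in the quoted proposal.
    LEVEL_NONE = 0      # Level 0: no validation
    LEVEL_SMALL = 1     # Level 1: "small prime" checks
    LEVEL_STRONG = 2    # Level 2: strong-prime checks (plus Level 1)
    LEVEL_EXTENDED = 3  # Level 3: prolonged checks (plus Level 2)

    def _probably_prime(n, rounds=40):
        # Standard Miller-Rabin probabilistic primality test.
        if n < 2:
            return False
        for p in (2, 3, 5, 7, 11, 13, 17, 19, 23, 29, 31, 37):
            if n % p == 0:
                return n == p
        d, r = n - 1, 0
        while d % 2 == 0:
            d, r = d // 2, r + 1
        for _ in range(rounds):
            a = random.randrange(2, n - 1)
            x = pow(a, d, n)
            if x in (1, n - 1):
                continue
            for _ in range(r - 1):
                x = pow(x, 2, n)
                if x == n - 1:
                    break
            else:
                return False
        return True

    def validate_rsa_primes(p, q, level=LEVEL_SMALL):
        """Validate the prime factors of an RSA key at the requested level."""
        if level <= LEVEL_NONE:
            return True
        # Level 1: each factor must pass the small-prime / probable-primality
        # test.  (Per the message above, for GnuPG's Lim-Lee primes this is
        # as far as validation can go.)
        if not (_probably_prime(p) and _probably_prime(q)):
            return False
        if level >= LEVEL_STRONG:
            # Levels 2 and 3 (strong-prime and prolonged tests) are not
            # sketched here; a real API would hang them off this branch.
            raise NotImplementedError("strong/extended checks not sketched")
        return True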

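On the local-key point, a cheap consistency check after loading a private key can catch an inadvertent transformation or bit flip without re-running the expensive generation-time tests. Again a rough Python sketch with hypothetical names:

    import math

    def rsa_private_key_consistent(n, e, d, p, q):
        # The stored factors must multiply to the stored modulus.
        if p * q != n:
            return False
        # e*d must be 1 modulo lcm(p-1, q-1); this also holds for keys whose
        # d was computed modulo (p-1)*(q-1).
        lam = (p - 1) * (q - 1) // math.gcd(p - 1, q - 1)
        if (e * d) % lam != 1:
            return False
        # Round-trip test: a value must survive encrypt-then-decrypt.
        m = 2
        return pow(pow(m, e, n), d, n) == m

    # Example: refuse to apply the key if the check fails.
    # if not rsa_private_key_consistent(n, e, d, p, q):
    #     raise ValueError("private key failed consistency check")
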
Received on Thursday, 30 May 2013 03:27:45 UTC