Re: [W3C Web Crypto WG] Deciding if we need a discovery mechanism

----- Original Message -----
> From: "Mitch Zollinger" <mzollinger@netflix.com>
> To: public-webcrypto@w3.org
> Sent: Thursday, June 7, 2012 7:24:34 PM
> Subject: Re: [W3C Web Crypto WG] Deciding if we need a discovery mechanism
> 

> In order to support pre-provisioned keys, hardware crypto, etc. would
> it
> be possible to use such APIs on a per-key basis?
> 

I think so. We do need to figure out just how flexible this mechanism should be. Perhaps the key handle object you get back has properties like: 

kh.mode
kh.padding
kh.algorithm



>      KeyHandle kh = getKey("MyKey");
>      // make sure it's either a 128-bit or 256-bit key
>      // and supports CBC mode with PKCS5 padding
>      if (kh.discover("AES256/CBC/PKCS5") ||
>          kh.discover("AES128/CBC/PKCS5")) {
>          // do your thing!
>      } else {
>          // error
>      }
> 
> Or are these APIs meant to handle only discovery of algorithms for
> runtime generated keys?
> 
This is just a stab in the dark. Your feedback is really helpful here.

Cheers,

David

> Mitch
> 
> >
> >
> > I think throwing "UnsupportedAlgorithmError" from a call with
> > unsupported algorithm IDs is a must.
> >
> > cheers,
> >
> > David
> >
> > ----- Original Message -----
> > From: "Mark Watson"<watsonm@netflix.com>
> > To: "GALINDO Virginie"<Virginie.GALINDO@gemalto.com>,
> > public-webcrypto@w3.org
> > Sent: Monday, June 4, 2012 12:48:00 PM
> > Subject: Re: [W3C Web Crypto WG] Deciding if we need a discovery
> > mechanism
> >
> > For completeness, the other option for algorithm discovery is
> > trial-and-error (unsupported algos return an "unsupported" error
> > and the script tries a different one).
> >
> > The disadvantages of this are obvious.
> >
> > The advantage is that it adds no additional complexity to the API.
> >
> > The disadvantages are mitigated if the set of alternative
> > algorithms that a *script may wish to use* is small, even if the
> > set of possible algorithms that a client may support is large. In
> > practice, sites probably support only a small set of different
> > algorithms, probably the minimal set needed to get the
> > browser/device coverage they want. So the trial-and-error process
> > is probably limited to two or three "tries".
> >
> > …Mark
> >
> >
> > On May 15, 2012, at 9:10 AM, Jarred Nicholls wrote:
> >
> > On Tue, May 15, 2012 at 10:59 AM, GALINDO
> > Virginie<Virginie.GALINDO@gemalto.com<mailto:Virginie.GALINDO@gemalto.com>>
> >  wrote:
> > Dear all,
> >
> > Some people mentioned that a webapp may be able to discover the
> > algorithms supported by the environment it is running in, thus
> > identifying the algorithms available through the Web Crypto API.
> > There are several means to do that: (1) an actual discovery
> > mechanism sending back the entire list of algorithms, or (2)
> > creating a ‘cipher suite’. In addition to that, we may be able
> > to mandate a minimum set of algorithms that should be supported
> > when implementing the API.
> >
> > I would like to get your feedback on that one, (or any alternative
> > proposal).
> >
> > Regards,
> > Virginie
> > gemalto
> >
> >
> > I'd say a combination of 1 and 2 would be satisfying. If there's
> > a way to use 1 and then create/discover a particular permutation
> > of a 'cipher suite', as well as a way to request it via a
> > cipher-suite-identifying string, for example, then I think we get
> > the best of both worlds in terms of robustness and
> > discoverability.
> >
> > Once again, it is my opinion that mandating a minimum set of
> > algorithms is aggressive.  While it does have its benefits in the
> > short term like predictability for authors and establishing a
> > deterministic test suite for the WG, there are a number of reasons
> > I am against it:
> >
> >    *   It could cause bad decisions in the API that take
> >        advantage of knowing a particular algorithm is meant to be
> >        implemented, or, even worse, are decided solely for a
> >        particular algorithm. The temptation could get the best of
> >        us ;)
> >    *   It means that vendors will have to come to consensus
> >        (assuming they all care about being compliant with the
> >        spec) on that set of algorithms, and that can be a
> >        slippery slope and a time sink, because there might be
> >        differences in requirements between some vendors and/or
> >        the locales they support.
> >    *   Authors will undoubtedly write code that is not future
> >        proof, always assuming certain support with no feature
> >        detection. This will make it harder for vendors to
> >        deprecate and remove algorithm support in the future when,
> >        e.g., the algorithm is no longer considered strong enough
> >        in the post-quantum cryptography world. If we can nail
> >        down easy discoverability and feature detection, then
> >        library authors will code defensively (as they should)
> >        without a lot of effort.
> >
> > Thanks,
> > Jarred
> >
> >
> >
> 
> 

Received on Friday, 8 June 2012 03:26:34 UTC