W3C home > Mailing lists > Public > public-webcrypto@w3.org > June 2012

Re: [W3C Web Crypto WG] Deciding if we need a discovery mechanism

From: Ryan Sleevi <sleevi@google.com>
Date: Thu, 7 Jun 2012 15:12:21 -0700
Message-ID: <CACvaWvb4eETD6ocq2crrygHbRVWezVAwfP=rxJWgbN2bRQX0eQ@mail.gmail.com>
To: David Dahl <ddahl@mozilla.com>
Cc: Mark Watson <watsonm@netflix.com>, GALINDO Virginie <Virginie.GALINDO@gemalto.com>, public-webcrypto@w3.org
On Thu, Jun 7, 2012 at 8:51 AM, David Dahl <ddahl@mozilla.com> wrote:

> ** this message was stuck in my outbox for a few days because of a
> computer failure **
> I have been playing around with an algorithm discovery API idea in a
> github repo.
> https://github.com/daviddahl/web-crypto-ideas/blob/master/algorithm-discovery.js
> I think throwing "UnsupportedAlgorithmError" from a call with unsupported
> algorithm IDs is a must.
> cheers,
> David

Hi David,

Thanks for taking a stab at this. I think the choice of discovery API, and
its behaviour, such as whether it throws an error/exception, must be very
closely tied to how the final API looks. Thus, it's hard to evaluate this
proposal without also understanding how you might see the use of RSA-OAEP
and the use of ECDH-ES working, using crypto.pk.algorithms.keyenc as an
example.

The reason I mention this is that I'm trying to think of how a web
developer might extend the API locally when a given user agent does not
support a desired algorithm/cryptographic primitive. Let us say, for
example, that a U-A is able to implement all of the algorithms except for
ECDH-ES. The web developer may wish to implement ECDH-ES purely within
JavaScript for these cases (ignoring the bignum constraints, etc). How they
do so depends on what the API looks like.

If algorithms are objects, you might imagine that it would be as simple as:

var ecdh_alg = window.crypto.pk.keyenc.ecdh_es || my_local_ecdh_es;
var my_key_encryptor = ecdh_alg(arg1, arg2, arg3);

If all algorithms have to be pumped through a singular function, then it
becomes a bit more complex:

function my_window_crypto_wrapper(...) {
  var my_key_encryptor = undefined;
  try {
    my_key_encryptor = window.crypto.pk.create("ECDH-ES", arg1, arg2, arg3, ...);
  } catch (error) {
    if (error instanceof UnsupportedAlgorithmError) {
      my_key_encryptor = my_local_ecdh_es(arg1, arg2, arg3, ...);
    }
  }
  return my_key_encryptor;
}

The above becomes even more complex when the web application may have
multiple local implementations, since it now needs to also handle switching
and function invocation dependent on the algorithm.
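To make that concrete, here is a rough sketch of what such dispatch might look like. Note that create(), UnsupportedAlgorithmError, and the fallback table are all hypothetical names used for illustration, not proposals:

```javascript
// Hypothetical error type a discovery-less API might throw.
function UnsupportedAlgorithmError(algorithm) {
  this.algorithm = algorithm;
}

// Stand-in for a native window.crypto.pk.create() in a U-A that
// only implements RSA-OAEP.
function create(algorithm) {
  if (algorithm !== "RSA-OAEP") {
    throw new UnsupportedAlgorithmError(algorithm);
  }
  return { algorithm: algorithm, native: true };
}

// Table of purely-local JavaScript implementations, keyed by
// algorithm name; the web app must maintain this itself.
var localImplementations = {
  "ECDH-ES": function () {
    return { algorithm: "ECDH-ES", native: false };
  }
};

// Wrapper that catches the exception and switches on the algorithm
// name to pick the right local fallback.
function createWithFallback(algorithm) {
  try {
    return create(algorithm);
  } catch (error) {
    if (error instanceof UnsupportedAlgorithmError &&
        localImplementations[algorithm]) {
      return localImplementations[algorithm]();
    }
    throw error;
  }
}
```

Every local algorithm the app supports needs its own entry and its own argument-marshalling in that table, which is the extra complexity I mean.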

The first approach, being object/prototype oriented, also lends itself to
more organic discovery of algorithms via JavaScript type
inspection/reflection, e.g.:

if (window.crypto.pk.keyenc.ecdh_es) { ... }
for (var algorithm in window.crypto.pk.keyenc) { ... }
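A sketch of how such enumeration might combine with local fallbacks. Again, the keyenc object, its contents, and my_local_ecdh_es are hypothetical stand-ins:

```javascript
// Stand-in for window.crypto.pk.keyenc in a U-A without ECDH-ES.
var keyenc = {
  rsa_oaep: function () { return "native rsa_oaep"; }
};

// Purely local JavaScript fallback implementation.
var my_local_ecdh_es = function () { return "local ecdh_es"; };

// Organic discovery: enumerate whatever the U-A exposes.
var supported = [];
for (var algorithm in keyenc) {
  supported.push(algorithm);
}

// Prefer the native implementation, falling back to local JS
// where the U-A lacks support -- no exception handling required.
var ecdh_alg = keyenc.ecdh_es || my_local_ecdh_es;
```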

That's not to say exceptions are not useful - I imagine exceptions may be
necessary for handling things like whether or not a key can be used for a
particular purpose (which is something that local JS could not emulate) -
but for things that may be able to be emulated locally, I think exceptions
may just add to the complexity of doing so.


> ----- Original Message -----
> From: "Mark Watson" <watsonm@netflix.com>
> To: "GALINDO Virginie" <Virginie.GALINDO@gemalto.com>,
> public-webcrypto@w3.org
> Sent: Monday, June 4, 2012 12:48:00 PM
> Subject: Re: [W3C Web Crypto WG] Deciding if we need a discovery mechanism
> For completeness, the other option for algorithm discovery is
> trial-and-error (unsupported algos return an "unsupported" error and the
> script tries a different one).
> The disadvantages of this are obvious.
> The advantage is that it adds no additional complexity to the API.
> The disadvantages are mitigated if the set of alternative algorithms that
> a *script may wish to use* is small, even if the set of possible algorithms
> that a client may support is large. In practice, sites probably support
> only a small set of different algorithms. Probably the minimal set needed
> to get the browser/device coverage they want. So the trial-and-error
> process is probably limited to two or three "tries".
> …Mark
> On May 15, 2012, at 9:10 AM, Jarred Nicholls wrote:
> On Tue, May 15, 2012 at 10:59 AM, GALINDO Virginie <
> Virginie.GALINDO@gemalto.com<mailto:Virginie.GALINDO@gemalto.com>> wrote:
> Dear all,
> Some people mentioned that a webapp may be able to discover the algorithms
> supported by the environment it is running in, thus identifying the
> algorithms available through the Web Crypto API. There are several means to
> do that: (1) an actual discovery mechanism sending back the entire list of
> algorithms, or (2) creating ‘cipher suites’. In addition to that, we may
> be able to mandate a minimum set of algorithms that should be supported
> when implementing the API.
> I would like to get your feedback on that one, (or any alternative
> proposal).
> Regards,
> Virginie
> gemalto
> I'd say a combination of 1 and 2 is satisfying.  If there's a way to use
> 1 and then create/discover a particular permutation of a 'cipher suite', as
> well as a way to request it via a cipher suite identifying string for
> example, then I think we get the best of both worlds in terms of robustness
> and discoverability.
> Once again, it is my opinion that mandating a minimum set of algorithms is
> aggressive.  While it does have its benefits in the short term like
> predictability for authors and establishing a deterministic test suite for
> the WG, there are a number of reasons I am against it:
>  *   It could cause bad decisions in the API that take advantage of
> knowing a particular algorithm is meant to be implemented, or even worse,
> are decided solely for a particular algorithm.  The temptation could get
> the best of us ;)
>  *   It means that vendors will have to come to consensus (assuming they
> all care about being compliant to the spec) on that set of algorithms, and
> that can be a slippery slope and a time sink because there might be
> differences in requirements between some vendors and/or the locales they
> support.
>  *   Authors will undoubtedly write code that is not future proof, always
> assuming certain support with no feature detection. This will make it
> harder for vendors to deprecate & remove algorithm support in the future
> when e.g. the algorithm is no longer considered strong enough in the
> post-quantum cryptography world.  If we can nail down easy discoverability
> and feature detection, then library authors will code defensively (as they
> should) without a lot of effort.
> Thanks,
> Jarred
Received on Thursday, 7 June 2012 22:12:50 UTC
