[Bug 25985] WebCrypto should be inter-operable

https://www.w3.org/Bugs/Public/show_bug.cgi?id=25985

--- Comment #24 from Ryan Sleevi <sleevi@google.com> ---
(In reply to Boris Zbarsky from comment #23)
> Sure.  What it looks like to me is that the current setup more or less
> _forces_ sites to do that.
> 
> Which, again, is why I keep asking for an example of how a well-authored
> site that doesn't want to do this will use this API and how it will know to
> do that correct thing.  That's where having some idea of what set of things
> one can expect a conforming implementation to implement (again, not all of
> them, but at least _some_ of them) seems like it could be useful.

I feel like I've given examples, but I suspect we're not communicating well on
this.

An application that has *specific* cryptographic requirements can attempt the
operations, detect a NotSupportedError, and surface to the user that their user
agent doesn't support the necessary algorithms.
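
For instance (a rough sketch; the choice of RSA-PSS and its parameters here
is mine, purely for illustration):

    crypto.subtle.generateKey(
        {name: "RSA-PSS", modulusLength: 2048,
         publicExponent: new Uint8Array([0x01, 0x00, 0x01]),
         hash: {name: "SHA-256"}},
        false, ["sign", "verify"])
      .then(function (keyPair) {
        // Supported; proceed with the application's protocol.
      }, function (err) {
        if (err.name === "NotSupportedError") {
          // Tell the user their UA lacks the required algorithm.
        }
      });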

An application with less-stringent requirements - let's use a concrete
example, say https://code.google.com/p/end-to-end/ - can use WebCrypto when
it's available and supports the desired algorithm (speed! correctness!), and
fall back to Javascript polyfills when it's not. They can make this decision
per application, depending on whether or not they feel the Javascript
implementation meets their security requirements.
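
Something along these lines (a sketch; polyfillDigest stands in for whatever
Javascript implementation the application has vetted):

    function sha256(data) {
      if (window.crypto && window.crypto.subtle) {
        return crypto.subtle.digest({name: "SHA-256"}, data)
          .then(null, function (err) {
            // WebCrypto is present but lacks this algorithm; fall back.
            if (err.name === "NotSupportedError")
              return polyfillDigest(data);
            throw err;
          });
      }
      // No WebCrypto at all; use the Javascript implementation.
      return Promise.resolve(polyfillDigest(data));
    }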

An application like https://developer.mozilla.org/en-US/Persona can similarly
use detection-and-polyfill, IF the security requirements are suitable.

An application such as the 'media streaming service over HTTP that wants to
prevent passive, but not active, attackers' can likewise make tradeoffs.
Perhaps they prefer WebCrypto for the key storage and the browser-mediated
key unwrap, and thus choose not to make the service available to UAs that
don't support their necessary security profile, rather than polyfill. Or they
might choose to polyfill via an iframe. Or they might be (mistakenly) more
concerned with keeping the keys from the user of the device than from the
observer on the network. Or they might require hardware-backed keys, much as
they might require specific CDMs or support for specific plugins, in which
case interoperability is already out the window.
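
To make the browser-mediated unwrap concrete (a sketch; the key names,
formats, and algorithms here are stand-ins):

    // unwrappingKey might come from IndexedDB or an earlier generateKey call.
    crypto.subtle.unwrapKey(
        "raw", wrappedContentKey, unwrappingKey,
        {name: "AES-KW"},    // how the content key was wrapped
        {name: "AES-CBC"},   // what the unwrapped key will be used for
        false,               // non-extractable: script never sees the raw bits
        ["decrypt"])
      .then(function (contentKey) {
        // Decrypt media segments with contentKey.
      });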

> 
> > they know that every UA *COULD* implement a given algorithm
> 
> That's not actually obvious from the spec as it currently stands.  Again,
> right now a UA that doesn't implement any algorithms at all is considered as
> conformant as any other UA.

Yes.

That is because, for the past two years, conformance has been separated into
APIs vs algorithms.

I suspect there may also be some confusion about the list of algorithms
included in the spec. As has been discussed on past calls, the intent is that
*every* algorithm be split out into a separate REC-track document; were the
current spec format (WebIDL.xsl) not so unwieldy, that would already have
happened.

That would not change the thing you object to, but would make it clearer to
readers that this is a "by design" and not "by accident" feature.

In this respect, it is similar (as I understand it) to how the canvas element
provides a consistent API/tag, but with different possible contexts ("2d",
"webgl"). Supporting the canvas tag does not, AFAICT, require that one
support WebGL.
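
That is, pages already feature-detect contexts along these lines:

    var canvas = document.createElement("canvas");
    // Supporting <canvas> does not imply supporting every context.
    var gl = canvas.getContext("webgl");
    if (!gl) {
      // The UA supports canvas, but not WebGL; detect and fall back.
    }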

> 
> We agree that specifying what happens when a UA does implement an algorithm
> is worthwhile.  That part is done; I'm not worrying about that part.  ;)

-- 
You are receiving this mail because:
You are on the CC list for the bug.

Received on Thursday, 5 June 2014 19:19:10 UTC