[Bug 25985] WebCrypto should be inter-operable

https://www.w3.org/Bugs/Public/show_bug.cgi?id=25985

--- Comment #28 from Henri Sivonen <hsivonen@hsivonen.fi> ---
(In reply to Ryan Sleevi from comment #26)
> (In reply to Henri Sivonen from comment #25)
> > Does any major browser these days ship different versions with different
> > sets of algorithms to different countries? If not, then it seems beside the
> > point to bring these points up. If they do, then it would be useful to be
> > specific about which algorithms get disabled in which cases by which
> > browsers.
> 
> Major browsers do, but in a way that is transparent to major browsers. This
> is because major browsers are not in the business, generally speaking, of
> shipping cryptography. They defer that to existing libraries.

Which major browsers defer to libraries that are not shipped by the same entity
that ships the browser?

Is there documentation of the shipping destination-specific differences in
algorithm availability in libraries used by major browsers?

> There's also exemptions that are applicable to browsers' use of cryptography
> (that is, as it relates to SSL/TLS) that do not apply to other uses.

OK.

> However, if your use case is "implementing a particular application
> protocol", then either you have negotiation capabilities - in which case,
> your application can work, sans polyfills - or you don't, in which case,
> you, the application author, can make a choice about whether polyfills are
> appropriate. In some cases, they are. In some cases, they are not. A
> protocol without negotiation is something that is rare in practice for
> standards bodies to produce, although I certainly admit the possibility of
> an application-specific protocol.

Suppose the protocol you implement is OpenPGP. You receive a message signed
with RSA. If you want to verify the signature, you need RSA signature
verification, either via Web Crypto or via a polyfill. You don't get to
negotiate an algorithm at that point, even though OpenPGP supports multiple
signature algorithms.
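
To make this concrete, here's a rough sketch (TypeScript-flavored; verifyRsaInJs
is a hypothetical stand-in for a pure-JS polyfill, and parsing the OpenPGP
packets into key, signature and data is elided) of what the verification step
boils down to:

  // Verify an RSA signature, preferring the Web Crypto API and falling
  // back to a JS implementation if the UA doesn't expose RSA verification.
  async function verifyRsaSignature(
    spkiKeyBytes: ArrayBuffer,  // signer's public key, re-encoded as SPKI
    signature: ArrayBuffer,     // raw signature bytes from the message
    signedData: ArrayBuffer     // the bytes the signature covers
  ): Promise<boolean> {
    const alg = { name: "RSASSA-PKCS1-v1_5", hash: "SHA-256" };
    try {
      const key = await crypto.subtle.importKey(
        "spki", spkiKeyBytes, alg, false, ["verify"]);
      return await crypto.subtle.verify(alg.name, key, signature, signedData);
    } catch (e) {
      // No RSA in the API: the only remaining option is a JS polyfill.
      // There is no alternative algorithm to negotiate at this point.
      return verifyRsaInJs(spkiKeyBytes, signature, signedData);
    }
  }

  // Hypothetical polyfill entry point (e.g. from a pure-JS RSA library).
  declare function verifyRsaInJs(
    key: ArrayBuffer, sig: ArrayBuffer, data: ArrayBuffer): Promise<boolean>;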

If you are implementing, e.g., the TextSecure protocol (whether that particular
protocol makes sense as part of a Web page isn't relevant to the example), you
need a specific set of cryptographic primitives. AFAIK, there's no
negotiation between functionally similar alternatives.

As for the point about standards bodies, I don't think it's reasonable to
expect that even a majority of the application-specific protocols that could
potentially use Web Crypto (assuming Web Crypto ends up being a consistent
enough basis to write apps on) would be specified by a standards body.

> > > As more and more users
> > > disable weak algorithms, user agents would be forced to remove *all*
> > > cryptographic capabilities 
> > 
> > This assumes that it's reasonable for users to be able to disable weak
> > algorithms in a low-level API. As noted above, the alternatives are that the
> > application breaks or that the application falls back on JS-implemented
> > crypto.
> 
> But the reality is that this happens, which you seem to believe doesn't.

I believe it happens. I just think it is unreasonable enough that the spec
shouldn't treat it as a legitimate, conforming thing; from the spec's
perspective it would make more sense to treat it like a user self-sabotaging
other features of their self-compiled copy of an Open Source browser.

> > The best bet is not to include weak algorithms in the first place,
> > but the nature of Web compatibility will make it hard to get rid of any
> > algorithm deemed weak subsequently.
> 
> This is irrelevant and not the point I was making. It was about algorithms
> that show future weaknesses.

Right. My point is that the dynamics of Web compatibility are such that if an
algorithm is shown weak in the future, you won't be able to make Web apps
secure by withdrawing the algorithm from the API underneath those apps.

Consider a recent example from a non-Web context: LibreSSL developers tried to
remove DES (not 3DES but DES) as insecure, but they had to put it back in,
because too many things that use OpenSSL as a non-TLS crypto library broke. At
least with *BSD systems you have a finite ports collection to test with. With
the Web, you don't have a finite collection of Web apps to test, so this
pattern can only be worse on the Web.

This is why I think it's unreasonable to design on the assumption that it
would be reasonable for browser developers, administrators or users to disable
the API availability of algorithms that have been taken into use and later
found weak. I believe browser configuration simply isn't going to be a viable
recourse for users to forcibly fix Web apps that use weak algorithms via the
Web Crypto API.

> The NIST curves are standard and, without exception, the most widely adopted
> curves - in implementations and in deployment. However, people, who would
> seem to include yourself, may have a distrust of the U.S. Government, and
> thus wish to disable these curves, using alternate curves.

I don't appreciate having something inferred about my trust or distrust of the
U.S. government from the point I made.

My point is that a policy (maybe the policy is not called FIPS, but at least in
Firefox the button that messes with crypto features is called "Enable FIPS")
that involves disabling some Web browser features as a matter of local policy
is hostile to Web compatibility and, therefore, something that Web standards,
which are supposed to enable interop, shouldn't particularly accommodate. If
there were only one actor with such a policy (the USG), it would be *possible*
to accommodate one such actor by making the browser feature set match the local
policy of the single actor. But as soon as there's anyone else who legitimately
should have more features, and there are, designing for a single actor (e.g.
the USG) is not OK. (This isn't a matter of trust in the USG but an observation
that you get a requirement conflict as soon as you have more than
one whitelist/blacklist of algorithms.)

FWIW, I think it's reasonable for an actor to have a local policy that says that
their servers shall only enable certain TLS cipher suites or that Web apps
hosted by them shall only use certain algorithms via the Web Crypto API. I
think it's unreasonable to subset the browser features so that the
compatibility of the browser with sites external to the actor is broken in ways
that are non-obvious to users. (The possibility of a site providing a JS
polyfill for a crypto algorithm shows how silly a policy that turns off Web
Crypto API features would be: Unapproved crypto algorithms would still end up
being used.)
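
As a sketch of that last point (again TypeScript-flavored; sha1Polyfill is a
hypothetical pure-JS implementation a site could ship), this is roughly what a
page does when a locally disabled algorithm is requested:

  // Hash some data with SHA-1, an algorithm a local policy might disable.
  async function sha1Digest(data: ArrayBuffer): Promise<ArrayBuffer> {
    try {
      // If local policy has removed SHA-1 from the Web Crypto API, this
      // call rejects...
      return await crypto.subtle.digest("SHA-1", data);
    } catch (e) {
      // ...and the page computes the same digest in JS anyway, so the
      // policy changes nothing except performance and compatibility.
      return sha1Polyfill(data);
    }
  }

  // Hypothetical pure-JS fallback shipped by the site.
  declare function sha1Polyfill(data: ArrayBuffer): Promise<ArrayBuffer>;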

I realize that there are organizations whose policies are unreasonable per the
above paragraph, but, personally, I don't think the WG should feel an
obligation to those organizations to make it easy for them to subset the API or
to bless such a subset configuration as conforming, since such subsetting is
hostile to the interoperability of Web tech, which the W3C should be promoting.
That is, I think it would be fine to say that
such subsetting is non-conforming. Not in the sense of expressing a belief that
such a thing would never happen but in the sense that non-conforming
configurations should be expected not to work interoperably. In other words, if
someone deliberately seeks to make things incompatible, the WG should feel no
need to make sure such deliberate incompatibility is accommodated within the
definition of conformance.

For the rest of the Web that doesn't seek deliberate incompatibility, there is
value in having a clear definition of what the compatible feature set for
vanilla browsers is.

-- 
You are receiving this mail because:
You are on the CC list for the bug.

Received on Wednesday, 11 June 2014 08:51:21 UTC