
Re: Fwd: New Version Notification for draft-nottingham-http2-encryption-02.txt

From: Brian Smith <brian@briansmith.org>
Date: Sat, 14 Dec 2013 12:40:58 -0800
Message-ID: <CAFewVt6j0yaRboARj=wpaVO2s9M6j7_za-GXLp9ZWqkFtSys8A@mail.gmail.com>
To: Stephen Farrell <stephen.farrell@cs.tcd.ie>
Cc: William Chan (陈智昌) <willchan@chromium.org>, Paul Hoffman <paul.hoffman@gmail.com>, HTTP Working Group <ietf-http-wg@w3.org>
On Sat, Dec 14, 2013 at 11:20 AM, Stephen Farrell <stephen.farrell@cs.tcd.ie> wrote:

> Possibly a different thread really but...
>
> On 12/14/2013 05:20 AM, William Chan (陈智昌) wrote:
> > Anyhow,
> > we don't support any type of opportunistic encryption, especially
> > unauthenticated. We want people to use https://, therefore we more or
> > less only plan to support HTTP/2 for https:// URIs. Let me know if
> > this still leaves anything unclear.
>
> What that leaves unclear for me is how the current 30-40% of web
> sites that are setup for some form of TLS will suddenly become
> 99%. Without some other action on helping sites get certs, it
> just won't happen would be my prediction.
>

We need to focus our effort on that problem.

There are already at least three commercial CAs, trusted by browsers, that
give away free certificates: StartCom (restricted to non-business use),
GlobalSign (restricted to open source projects), and GoDaddy (restricted to
open source projects). These CAs give away an inferior good (presumably) in
the hope that you will eventually upgrade to their non-free offerings. The
main problem with these CAs' freemium models is that the decision process
for whether you qualify for the free product isn't (and cannot be) automated.
However, I believe there is an opportunity for us (browser makers in
particular, and the IETF community in general) to create a new kind of
inferior good in the certificate space that CAs (possibly other than the
ones I mentioned) may be willing to give away for free, in a way they can
be comfortable with and without jeopardizing their businesses. Note: when I
say "inferior good," I use "inferior" in the economic sense only; I think
we'd insist that such certificates have security properties at least as
good as what we already accept as the minimum in browsers today.

Even if such efforts were to fail, we still wouldn't be at the point where
completely unauthenticated encryption is the only option left. There are
other ways of authenticating servers than punting to a commercial CA. We
should make sure we have thoroughly exhausted these alternatives before
giving up.

> I think it's all the more puzzling when contrasted with other cases
> where people claim that we can't do X because that'd cause a problem
> for 1% of the web, but yet here you seem to be saying its ok to
> do this when it'd cause a problem for 60-70% of the web. (I don't
> recall whether or not you've made such claim William.)
>

When it comes to breaking interoperability or regressing performance, small
percentages like 1% matter. The fact that most connections web browsers
make are not encrypted+authenticated is a huge problem that needs to be
addressed with strong action, but it isn't acute in the way that a
compatibility or performance regression is.

Difficulty with certificates doesn't explain why bing.com, reddit.com,
tumblr.com, baidu.com, wikipedia.com, and other top sites aren't
HTTPS-only. Social issues (Wikipedia has been very open about how politics
affects its HTTPS deployment) and performance issues are much more serious,
and those issues won't be properly addressed by adding opportunistic
encryption to HTTP/2.

Do third-party advertising sites (the kind whose cookies are being used to
de-anonymize users) use HTTP instead of HTTPS because they can't afford
certificates? No. Performance, scalability, the pain of migrating websites
from http:// to https:// URLs, and lack of motivation seem to be the
problems. Web browsers can encourage them to move to HTTPS by getting them
on our HSTS preload lists (so the browser "fixes" those http:// links to
https:// links automatically) and by doing other things. For example, at
Mozilla we've long had a desire to strip cookies from third-party requests
that aren't HTTPS. It seems like now is the time to figure out how to make
that work. We've already seen big advertisers make changes like this to
accommodate our recent mixed-content blocking changes. I'm confident that
such advertisers would be willing to accommodate further changes, if nudged
a little bit.
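To make the HSTS mechanism concrete: a site opts in by serving a Strict-Transport-Security header (RFC 6797), and the "preload" directive signals that it wants to be on the browsers' preload lists so http:// links get rewritten to https:// before any request is sent. Here is a minimal sketch of parsing that header; the function and field names are my own illustration, not any browser's actual implementation:

```python
def parse_hsts(header):
    """Parse a Strict-Transport-Security header (RFC 6797) into a dict.

    Example header value: "max-age=31536000; includeSubDomains; preload"
    """
    policy = {"max_age": None, "include_subdomains": False, "preload": False}
    for directive in header.split(";"):
        directive = directive.strip()
        if directive.lower().startswith("max-age="):
            # max-age is how long (in seconds) the browser remembers the policy
            policy["max_age"] = int(directive.split("=", 1)[1].strip('"'))
        elif directive.lower() == "includesubdomains":
            policy["include_subdomains"] = True
        elif directive.lower() == "preload":
            # "preload" signals consent to be baked into browser preload lists
            policy["preload"] = True
    return policy

# A site asking browsers to "fix" its http:// links and to be preloaded:
print(parse_hsts("max-age=31536000; includeSubDomains; preload"))
```

A site serving this header with "preload" can then be submitted to the browsers' preload lists, after which even a user's first visit never touches http://.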


> Even if only as a backup in case that 30-40% -> 99% transition
> fails, I'd hope folks do continue working on ways to provide
> opportunistic encryption for HTTP/2.0.
>

I agree that it is reasonable to continue exploring unauthenticated
encryption options. However, I encourage people to support the efforts to
go further: to try to hit a home run instead of bunting.


> On the current draft - its seems quite odd to ignore the existing
> anon-DH ciphersuites when trying to do opportunistic encryption.
>

The way cipher suites are currently negotiated in TLS, with the client
saying which cipher suites it supports and the server choosing one, suffers
from the same problem that ALPN causes for http2-tls-relaxed: the client is
telling potential MitMs whether or not they will get caught. I appreciate,
and agree with, the fundamental aims of the perpass effort. However, I
think way too much emphasis is being put on the "passive" part. We need to
remember that perfect is the enemy of the good, but at the same time it
would be unfortunate to spend a huge amount of effort trying to prevent
passive attacks while making active attacks easier to carry out.
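A toy model of the information leak described above (my own illustration, not real TLS code; the suite names are stand-ins): because the ClientHello's cipher suite list travels in cleartext, an active MitM can read it and intercept only those connections where the client admits to supporting an anonymous (certificate-free) suite, knowing in advance it won't be caught:

```python
# Illustrative cipher suite sets; not an actual TLS implementation.
AUTHENTICATED = {"ECDHE_RSA_AES128_GCM"}   # server certificate is verified
ANONYMOUS = {"ADH_AES128_SHA"}             # anon-DH: no certificate at all

def mitm_gets_caught(chosen_suite):
    """An active MitM who substitutes its own key is detected only when
    the negotiated suite authenticates the server via a certificate."""
    return chosen_suite in AUTHENTICATED

def mitm_should_intercept(offered_suites):
    """The offered list is sent in cleartext, so a MitM can inspect it and
    force an anonymous suite whenever the client advertises one; it
    intercepts only when it knows it will not be detected."""
    return bool(offered_suites & ANONYMOUS)

# A client that advertises anon-DH invites risk-free active interception:
print(mitm_should_intercept({"ECDHE_RSA_AES128_GCM", "ADH_AES128_SHA"}))  # True
print(mitm_should_intercept({"ECDHE_RSA_AES128_GCM"}))  # False
```

This is the sense in which the client's advertisement "tells potential MitMs whether or not they will get caught": the attacker pays no detection cost for selectively downgrading exactly the clients that opted into unauthenticated encryption.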

Cheers,
Brian
-- 
Mozilla Networking/Crypto/Security (Necko/NSS/PSM)
Received on Saturday, 14 December 2013 20:41:27 UTC
