Re: something I don't get about the current plan...

Hi Stephen,

I think that's a great question.

The underlying assumption seems to be that the performance (and other?) benefits of HTTP/2 will lure sites into deploying TLS. Other things could help too, of course -- e.g., a better administrator experience when deploying certs on the server -- but that's out of scope for us.

In short, HTTP/2 is being positioned as a gigantic carrot. Because the incentives are lined up (the person who needs to install the cert is getting the benefit of HTTP/2), the theory is that it's not like the other cases.

However, this still assumes that enough people will want those benefits badly enough to go through the pain of deploying TLS.

Opportunistic encryption is also a means of addressing this issue; however, there seems to be a lot of doubt about how its introduction would affect the Web, whereas the current approach ("HTTPS Everywhere", to steal a phrase from the EFF) has better-understood properties.

In the current plan, opp encryption may still have a place, if adoption of HTTP/2 over TLS for https:// URIs turns out to be very low.

So, I'd like to hear from those who don't like the current plan: would opp encryption (in a nutshell, HTTP/2 for http:// URIs over TLS without server authentication) help or hurt?
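
To make "TLS without server authentication" concrete, here's a minimal Go sketch of a client that gets encryption but deliberately skips certificate verification. The host name is made up, and raw TLS here is just a stand-in for whatever mechanism HTTP/2 would actually specify:

package main

import (
	"crypto/tls"
	"fmt"
	"log"
)

func main() {
	conf := &tls.Config{
		// "No server authentication" means exactly this: we get
		// confidentiality against passive observers, but no
		// protection from an active man-in-the-middle.
		InsecureSkipVerify: true,
	}
	conn, err := tls.Dial("tcp", "example.com:443", conf)
	if err != nil {
		log.Fatal(err)
	}
	defer conn.Close()
	fmt.Println("negotiated cipher suite:",
		tls.CipherSuiteName(conn.ConnectionState().CipherSuite))
}

The point of the sketch is the trade-off, not the API: an opp-encryption client would never show a lock icon or treat the connection as https://, because it has no idea who it's actually talking to.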

Also, I'm wondering what people (on both sides) would think if we allowed HTTP/2 for http:// URIs (with or without opp encryption) for .local names and RFC 1918 addresses, to ease the IoT / printer cases.
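
For concreteness, here's a rough Go sketch of the kind of scope test a client might apply under that carve-out; the function name and the exact rule set (.local suffix plus the three RFC 1918 blocks) are my assumptions about what "local" would mean, not settled policy:

package main

import (
	"fmt"
	"net"
	"strings"
)

// The three private IPv4 blocks defined by RFC 1918.
var rfc1918 = []*net.IPNet{
	mustCIDR("10.0.0.0/8"),
	mustCIDR("172.16.0.0/12"),
	mustCIDR("192.168.0.0/16"),
}

func mustCIDR(s string) *net.IPNet {
	_, n, err := net.ParseCIDR(s)
	if err != nil {
		panic(err)
	}
	return n
}

// isLocalScope reports whether a host looks like a local-network
// target: an mDNS-style .local name, or a literal RFC 1918 address.
func isLocalScope(host string) bool {
	if strings.HasSuffix(strings.ToLower(host), ".local") {
		return true
	}
	ip := net.ParseIP(host)
	if ip == nil {
		return false
	}
	for _, n := range rfc1918 {
		if n.Contains(ip) {
			return true
		}
	}
	return false
}

func main() {
	for _, h := range []string{"printer.local", "192.168.1.20", "example.com"} {
		fmt.Printf("%-16s -> %v\n", h, isLocalScope(h))
	}
}

Note the obvious hole any such rule would have to deal with: a public DNS name can resolve to an RFC 1918 address, so checking only the literal host string wouldn't be enough on its own.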

Cheers,



On 18/11/2013, at 3:09 AM, Stephen Farrell <stephen.farrell@cs.tcd.ie> wrote:

> 
> So the current plan is for server-authenticated https
> everywhere on the public web. If that works, great. But
> I've a serious doubt.
> 
> 30% of sites use TLS that chains up to a browser-trusted
> root (says [1]). This plan has nothing whatsoever to say
> (so far) about how that will get to anything higher.
> 
> Other aspects of HTTP/2.0 appear to require reaching a
> "99.9% ok" level before being acceptable, e.g. the port
> 80 vs not-80 discussion.
> 
> That represents a clear inconsistency in the arguments for
> the current plan. If it's not feasible to run on e.g. port
> 100 because of a 10% failure rate, then how is it feasible
> to assume that 60% of sites will do X (for any X, including
> "get a cert"), to get to the same 90% figure which is
> apparently unacceptable, when there's no plan for more-X
> and there's reason to think getting more web sites to do
> this will in fact be very hard at best?
> 
> I just don't get that, and the fact that the same people are
> making both arguments seems troubling; what am I missing
> there?
> 
> I would love to see a credible answer to this, because I'd
> love to see the set of sites doing TLS server-auth "properly"
> be much higher, but I have not seen anything whatsoever about
> how that might happen so far.
> 
> And devices that are not traditional web sites represent a
> perhaps even more difficult subset of this problem. Yet the
> answer for the only such example raised (printers, a real
> example) was "use http/1.1" which seems to me to be a bad
> answer, if HTTP/2.0 is really going to succeed HTTP/1.1.
> 
> Ta,
> S.
> 
> PS: In case it's not clear, if there were a credible way to
> get that 30% to 90%+ and address devices, I'd be delighted.
> 
> PPS: As I said before, my preference is for option A in
> Mark's set - use opportunistic encryption for http:// URIs
> in HTTP/2.0. So if this issue were a fatal flaw, then I'd
> be arguing we should go to option A and figure out how to
> handle mixed-content for that.
> 
> [1] http://w3techs.com/technologies/overview/ssl_certificate/all
> 

--
Mark Nottingham   http://www.mnot.net/
