
Re: Preliminary agenda for Dallas

From: Yoav Nir <ynir.ietf@gmail.com>
Date: Tue, 10 Mar 2015 21:19:17 +0200
Cc: Mark Nottingham <mnot@mnot.net>, HTTP Working Group <ietf-http-wg@w3.org>
Message-Id: <7E163273-2264-4C82-A148-66972E4C3CFE@gmail.com>
To: Poul-Henning Kamp <phk@phk.freebsd.dk>

> On Mar 10, 2015, at 10:53 AM, Poul-Henning Kamp <phk@phk.freebsd.dk> wrote:
> 
> --------
> In message <13FE6D6D-BB19-455D-95C5-073A299009DE@mnot.net>, Mark Nottingham writes:
> 
>>> Who is going to open issues based on Bob Briscoes critique ?
>> 
>> Martin will apply editorial suggestions as he's able 
>> (constrained by where we're at in the process); anything more 
>> will need to be an errata, or held for the next update.
> 
> In other words: The WG is going to totally ignore the substance of
> the first competent, comprehensive and thorough outside review
> HTTP/2.0 has ever received.
> 
> Show of hands:  How many think that is the wrong thing to do ?

I’m keeping my hand down.

The HTTP/2 standard was written by a relatively small bunch of people; there have probably been around two dozen active participants in the process. This group is not representative of all implementers of HTTP, nor of anything but a small subset of deployers. Furthermore, if Bob Briscoe had joined in earlier, some things might have been different, although I don’t really think so; there were other people pushing for more extensibility. Testing of this protocol has been done at some scale, but only on a few websites, which collectively are not representative of “the web”.

A new version of HTTP is bound to come from such a small and non-representative group; the web is just too big. As time went by, the discussions in the group became less fruitful. That is inevitable: with a small group and a small pool of test subjects, we ran out of new information to act on. The next stage for this, as for any other protocol, is to try it in different environments. And that is facilitated by publishing a “Proposed Standard”.

In a year’s time, we’ll have several independent clients and a lot of servers. People with personal blogs, CDNs, firewalls, home routers, news sites and online businesses will attempt to use HTTP/2. Not all of them, but a much more representative sample than we have today. And it’s not as if a failure of HTTP/2 will break the Internet; we have a safety net in HTTP/1. HTTP/1 is fine. It works great. It works so well that it supplanted many protocols, such as FTP.

In a year’s time, we might get some “war stories” from people who have tried HTTP/2. Maybe they’ll love it. Maybe they’ll think it needs to be extended. Maybe they’ll think we need something like an interactive bias so that the first bytes of a stream are given higher priority, which will allow web pages to settle in size faster. Maybe they’ll revert to HTTP/1 and have good explanations about why HTTP/2 was inappropriate. 
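To make the "interactive bias" idea concrete, here is a minimal sketch of what such a scheduler could look like. This is purely hypothetical illustration, not HTTP/2's actual PRIORITY mechanism: the `Stream` class, the `BOOST_BYTES` threshold, and the two-phase `schedule` function are all invented names for this example. The first `BOOST_BYTES` of every stream are served before any stream's remaining bytes, so small responses complete early and the head of each large response arrives quickly, letting the page "settle in size" sooner.

```python
from collections import deque

# Hypothetical illustration of an "interactive bias" scheduler.
# All names here (Stream, BOOST_BYTES, schedule) are assumptions made
# up for this sketch, not part of HTTP/2.

BOOST_BYTES = 1024  # assumed size of the high-priority head of each stream


class Stream:
    def __init__(self, stream_id, payload):
        self.stream_id = stream_id
        self.payload = payload
        self.sent = 0

    def take(self, n):
        """Return up to n unsent bytes and advance the send position."""
        chunk = self.payload[self.sent:self.sent + n]
        self.sent += len(chunk)
        return chunk

    @property
    def done(self):
        return self.sent >= len(self.payload)


def schedule(streams, frame_size=512):
    """Return a list of (stream_id, chunk) frames.

    Phase 1 round-robins over the boosted head of every stream;
    phase 2 round-robins over whatever bytes remain.
    """
    frames = []
    # Phase 1: serve the boosted region of each stream first.
    boosted = deque(s for s in streams if not s.done)
    while boosted:
        s = boosted.popleft()
        budget = min(frame_size, BOOST_BYTES - s.sent)
        frames.append((s.stream_id, s.take(budget)))
        if not s.done and s.sent < BOOST_BYTES:
            boosted.append(s)
    # Phase 2: serve the remaining bytes of the large streams.
    rest = deque(s for s in streams if not s.done)
    while rest:
        s = rest.popleft()
        frames.append((s.stream_id, s.take(frame_size)))
        if not s.done:
            rest.append(s)
    return frames
```

With an 800-byte stream and a 3000-byte stream, the small stream finishes entirely within phase 1, before the large stream's tail is sent; a plain round-robin would interleave them to the end. Whether that trade-off helps real pages is exactly the kind of question field experience would answer.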

I think that kind of input is what we need to consider before starting work on a -bis document or HTTP/3 or whatever. I don’t think that thought experiments are going to add new useful insights. So IMO the best course of action is to publish HTTP/2 as is, and wait for feedback from the field, especially the parts of the field that were not represented in this discussion.

Yoav
Received on Tuesday, 10 March 2015 19:19:47 UTC
