
Re: Significantly reducing headers footprint

From: patrick mcmanus <pmcmanus@mozilla.com>
Date: Mon, 11 Jun 2012 14:50:45 -0400
Message-ID: <4FD63E05.5060707@mozilla.com>
To: ietf-http-wg@w3.org
On 6/11/2012 5:16 AM, Willy Tarreau wrote:
> On Sun, Jun 10, 2012 at 04:39:37PM -0700, Roberto Peon wrote:
>     =>  With better request reordering, we could have this :
>        11 Accept: */*
>       109 Accept: image/png,image/*;q=0.8,*/*;q=0.5
>         4 Accept: text/css,*/*;q=0.1
>         3 Accept:
> text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8
>> Achieving this seems difficult? How would we get a reordering to occur in a
>> reasonable manner?
> I don't think it's that difficult, but I'm not a browser developer and I'm
> sure they're facing a huge amount of complex issues. For instance, maybe
> it's not always possible to fetch all images at a time, or to fetch css
> first then images. I must say I don't know :-/

Reordering adds latency: when you're doing streamed parsing, you have to 
wait to discover the full set of things that should be reordered. This 
is a situation SPDY actually improves - in HTTP/1 you might not send a 
resource request as soon as you discover it (adding latency) in order to 
speculatively preserve bandwidth for resources you hope will be 
discovered "soon". In SPDY you can just send them all asap with 
appropriate priorities attached to manage the bandwidth - reintroducing 
a motivation for queuing is undesirable imo.
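A minimal sketch of the dispatch model described above, assuming a hypothetical PrioritizedSender: each request is sent the moment it is discovered, tagged with a priority, and the sender allocates bandwidth by priority rather than holding requests back in a speculative queue. The URLs and priority values are illustrative only.

```python
import heapq

# Hypothetical sketch: requests are dispatched the moment they are
# discovered, each tagged with a priority (lower number = higher
# priority), instead of being held back in a speculative queue.
class PrioritizedSender:
    def __init__(self):
        self._heap = []
        self._seq = 0  # tie-breaker preserves discovery order

    def discover(self, url, priority):
        # "Send" immediately: push onto the heap rather than waiting
        # to learn about other resources first.
        heapq.heappush(self._heap, (priority, self._seq, url))
        self._seq += 1

    def drain(self):
        # Bandwidth is then allocated by priority, not arrival order.
        out = []
        while self._heap:
            _, _, url = heapq.heappop(self._heap)
            out.append(url)
        return out

sender = PrioritizedSender()
sender.discover("/page.html", 0)
sender.discover("/hero.png", 2)   # images: lower priority
sender.discover("/style.css", 1)  # css discovered later, still wins
print(sender.drain())  # → ['/page.html', '/style.css', '/hero.png']
```

The point is that discovery order and transmission order are decoupled by the priority tag, so nothing needs to be queued while waiting to see what else the parser finds.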

So this seems like an unnecessary constraint to solve a situation that 
gzip windows already effectively address.
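To illustrate the gzip-window point: when many requests share one deflate stream (as SPDY header compression does), repeated header lines cost almost nothing after their first occurrence, so reordering requests to group identical Accept headers buys little. A small sketch using Python's zlib, with the header counts taken from the tallies quoted earlier in the thread:

```python
import zlib

# Repeated Accept headers, roughly matching the counts quoted above.
headers = (
    b"Accept: image/png,image/*;q=0.8,*/*;q=0.5\r\n" * 109
    + b"Accept: */*\r\n" * 11
    + b"Accept: text/css,*/*;q=0.1\r\n" * 4
)

# One shared deflate stream across all "requests".
comp = zlib.compressobj()
compressed = comp.compress(headers) + comp.flush()

# The repeats fall inside the 32 KB deflate window, so the compressed
# size is a small fraction of the raw size regardless of ordering.
print(len(headers), len(compressed))
assert len(compressed) < len(headers) // 10
```

Since the whole repeated set fits comfortably inside the 32 KB window, duplicates compress to a few bytes each whether or not identical headers are adjacent.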
Received on Monday, 11 June 2012 18:51:14 UTC
