Re: Significantly reducing headers footprint

Hi Patrick,

On Mon, Jun 11, 2012 at 02:50:45PM -0400, patrick mcmanus wrote:
> On 6/11/2012 5:16 AM, Willy Tarreau wrote:
> >On Sun, Jun 10, 2012 at 04:39:37PM -0700, Roberto Peon wrote:
> >    =>  With better request reordering, we could have this:
> >
> >       11 Accept: */*
> >      109 Accept: image/png,image/*;q=0.8,*/*;q=0.5
> >        4 Accept: text/css,*/*;q=0.1
> >        3 Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8
> >
> >>Achieving this seems difficult? How would we get reordering to occur in a
> >>reasonable manner?
> >I don't think it's that difficult, but I'm not a browser developer and I'm
> >sure they're facing a huge number of complex issues. For instance, maybe
> >it's not always possible to fetch all images at once, or to fetch CSS
> >first and then images. I must say I don't know :-/
> >
> 
> reordering adds latency, because when you're doing streamed parsing you
> have to wait to discover the full set of things that should be reordered.

This is more or less what I was suspecting, but of course it's much clearer
with your explanation!

> This is a 
> situation SPDY actually improves - in HTTP/1 you might not send a 
> resource request as soon as you discover it (adding latency) in order to 
> speculatively preserve bandwidth for resources you hope will be 
> discovered "soon".. in spdy you can just send them all asap with 
> appropriate priorities attached to manage the bandwidth - reintroducing 
> a motivation for queuing is undesirable imo.
> 
> So this seems like an unnecessary constraint to solve a situation that 
> gzip windows already effectively address.

OK, thanks for your insights!
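
For what it's worth, here's a quick sketch of the gzip-window point (using
Python's zlib purely for illustration, with made-up header values): once the
first request's headers are in the shared deflate context, a second, almost
identical request only costs a handful of bytes:

    # Two HTTP/1.1 request header blocks that differ only in the path,
    # compressed in a single deflate stream, i.e. roughly what a shared
    # gzip context over one connection would look like.
    import zlib

    req1 = (b"GET /index.html HTTP/1.1\r\n"
            b"Host: www.example.com\r\n"
            b"User-Agent: Mozilla/5.0 (X11; Linux x86_64) Gecko/20100101 Firefox/13.0\r\n"
            b"Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8\r\n"
            b"Accept-Language: en-us,en;q=0.5\r\n"
            b"Accept-Encoding: gzip, deflate\r\n"
            b"Connection: keep-alive\r\n\r\n")
    req2 = req1.replace(b"/index.html", b"/style.css")

    comp = zlib.compressobj(9)
    first  = comp.compress(req1) + comp.flush(zlib.Z_SYNC_FLUSH)
    second = comp.compress(req2) + comp.flush(zlib.Z_SYNC_FLUSH)
    print(len(req1), "->", len(first))   # first request pays the full cost
    print(len(req2), "->", len(second))  # second one is only a few dozen bytes

The exact numbers don't matter of course, the point is simply that the
repeated header lines fall well within the 32kB deflate window, as you say.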

Willy

Received on Monday, 11 June 2012 22:10:39 UTC