On Mon, Jun 25, 2012 at 8:33 PM, Mark Nottingham <mnot@mnot.net> wrote:
>
> On 23/06/2012, at 6:08 AM, Roberto Peon wrote:
>
> > I'd argue another point.
> > The amount of work necessary to optimize site performance for HTTP/1.1
> > today is large. The amount of knowledge necessary to do it properly is
> > also large.
> > This is not the way it should be!
>
+1
> >
> > The protocol should make it easier to do things right, and it should
> > help in the (extremely frequent and likely) case that the site designer
> > gets it wrong in little ways.
>
> This is definitely an area that should be discussed. I've heard a few
> people express skepticism about multiplexing overall, because it requires
> the server to prioritise what's in the pipe, which in turn requires greater
> knowledge (and probably a bucketload of heuristics).
> Right now those heuristics are applied to how browsers make requests, but
> at least the data is applied in the same place it's most usefully sourced,
> and of course there are fewer browser implementations than there are server
> deployments (which is potentially the level at which this kind of tuning
> would need to take place for multiplexing).
>
> Discuss :)
>
I'm a bit surprised people express much skepticism. Basic heuristics can
get you a long way: for example, simply prioritizing html/js/css over other
resources like images already helps a lot. Remember that the majority of
implementations, including Chromium's and Google's, are still in the early
stages of optimization.
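
To make that concrete, here's a rough sketch (Python, purely illustrative)
of the kind of heuristic I mean: pick a stream priority from the response's
Content-Type before writing frames. The content-type table and the 0-7
scale (0 = highest) assume SPDY/3-style priorities; the exact values are
placeholders of mine, not a recommendation.

# Map a response's content type to a stream priority (0 = highest).
CONTENT_TYPE_PRIORITY = {
    "text/html": 0,               # the document gates everything else
    "application/javascript": 1,
    "text/css": 1,
    "image/png": 4,
    "image/jpeg": 4,
    "image/gif": 4,
}

DEFAULT_PRIORITY = 3              # unknown types land in the middle

def priority_for(content_type):
    """Pick a priority from the Content-Type header value."""
    # Strip parameters such as "; charset=utf-8".
    base_type = content_type.split(";", 1)[0].strip().lower()
    return CONTENT_TYPE_PRIORITY.get(base_type, DEFAULT_PRIORITY)

if __name__ == "__main__":
    for ct in ("text/html; charset=utf-8", "text/css",
               "image/jpeg", "font/woff"):
        print(ct, "->", priority_for(ct))

Even something this crude pushes the render-blocking resources to the front
of the pipe without the server operator having to tune anything by hand.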
> --
> Mark Nottingham http://www.mnot.net/