
Re: Performance implications of Bundling and Minification on HTTP/1.1

From: Poul-Henning Kamp <phk@phk.freebsd.dk>
Date: Sat, 23 Jun 2012 08:29:47 +0000
To: "Martin Nilsson" <nilsson@opera.com>
cc: "HTTP Working Group" <ietf-http-wg@w3.org>
Message-ID: <4489.1340440187@critter.freebsd.dk>
In message <op.wgbxldx3iw9drz@manganese.bredbandsbolaget.se>, "Martin Nilsson" writes:

>Also, some HTTP requests are rewritten by proxies and anti-virus  
>applications to disable compression, so compression will be used even less.

... and they have a good reason to disable gzip: these devices sit at
the "choke-points" in the network and see some of the highest
HTTP-traffic densities of any devices in the HTTP domain.

There are two subcases, and they are quite different:


The first is typically a load-balancer, which needs only to inspect
the "Host:" header and/or the URI in the request, and the status code
of the response.

These are the devices I call "HTTP routers", and they are where
all the traffic bottlenecks when the entire world tries to find
out what happened in Dallas.
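To make the point concrete, here is a minimal sketch of such an "HTTP
router": it dispatches on the Host header alone and never touches (or
decompresses) anything else.  The backend map and header layout are
illustrative assumptions, not anything from the message above.

```python
# Hypothetical sketch of an "HTTP router" / load-balancer that routes
# on the Host: header only.  Backend addresses are made up.

BACKENDS = {
    "news.example.com": "10.0.0.10:8080",
    "static.example.com": "10.0.0.20:8080",
}
DEFAULT_BACKEND = "10.0.0.1:8080"

def route(raw_request: bytes) -> str:
    """Pick a backend by scanning the plain-text headers for Host:."""
    head, _, _ = raw_request.partition(b"\r\n\r\n")
    for line in head.split(b"\r\n")[1:]:          # skip the request line
        name, _, value = line.partition(b":")
        if name.strip().lower() == b"host":
            host = value.strip().decode("latin-1")
            return BACKENDS.get(host, DEFAULT_BACKEND)
    return DEFAULT_BACKEND
```

The whole decision reads a few hundred bytes at most; if the headers
were gzip'ed, every such box would have to decompress first.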

HTTP/2.0 should serialize (at least) these crucial fields without
gzip and preferably in a way that makes it very easy and cheap to
find them.
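One way to make those fields "very easy and cheap to find" is to carry
them uncompressed and length-prefixed at the front of each message, so
a router reads only a few dozen bytes.  The following is a hypothetical
framing to illustrate the idea; it is not HTTP/2's actual wire format.

```python
import struct

# Hypothetical framing: routing-critical fields (e.g. method, host,
# path) travel uncompressed with 16-bit length prefixes at the front
# of the frame; everything behind them may be compressed.

def pack_routing_fields(*fields: bytes) -> bytes:
    out = bytearray()
    for f in fields:
        out += struct.pack("!H", len(f)) + f   # 2-byte length, then value
    return bytes(out)

def unpack_routing_fields(frame: bytes, n: int) -> list[bytes]:
    """Extract the first n fields without decompressing anything."""
    fields, pos = [], 0
    for _ in range(n):
        (length,) = struct.unpack_from("!H", frame, pos)
        fields.append(frame[pos + 2 : pos + 2 + length])
        pos += 2 + length
    return fields
```

A router that only needs the host can stop after the second field and
ignore the rest of the frame entirely.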


The second is almost always content-scanning, and since there are
legitimate use cases (prison inmates, for instance) we have to accept
this role as legitimate[1].

A legitimate argument exists that censors should pay the cost
of censorship.  If we accept that, these boxes should not be
able to force clients/servers to forego compression.

Poul-Henning Kamp       | UNIX since Zilog Zeus 3.20
phk@FreeBSD.ORG         | TCP/IP since RFC 956
FreeBSD committer       | BSD since 4.3-tahoe    
Never attribute to malice what can adequately be explained by incompetence.
Received on Saturday, 23 June 2012 08:30:13 UTC
