
Re: Reducing HTTP payload size [was: RE: HTTP idea]

From: Henrik Nordstrom <hno@squid-cache.org>
Date: Fri, 12 Jan 2007 04:57:30 +0100
To: David Morris <dwm@xpasc.com>
Cc: ietf-http-wg@w3.org
Message-Id: <1168574250.29785.23.camel@henriknordstrom.net>
On Thu 2007-01-11 at 18:30 -0800, David Morris wrote:

> Hard to generalize from extensive case-by-case examination, but Google is
> the only site I've observed which uses gzip. Over the past several years
> I've spent a lot of time measuring page-load timing for well-known and
> not-so-well-known sites and have not observed use of gzip.

Google is far from alone in using gzip.

Dynamic "Content-Encoding: gzip" appears to be spreading quite rapidly
across the Internet, largely because it is trivial to configure in most
web servers these days.
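As one illustration of how little configuration it takes (a sketch of my own, not from the original message), enabling on-the-fly gzip in Apache 2.x with mod_deflate can be as short as:

```apache
# Compress common text types on the fly (Apache 2.x, mod_deflate).
# Binary types such as images are left alone; they are already compressed.
AddOutputFilterByType DEFLATE text/html text/plain text/css
```

Most other servers of the era (IIS, lighttpd, etc.) offer a comparably small switch, which is likely why adoption is spreading.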

I don't have any readily available statistics, but it looks like a
fairly significant percentage of web sites are using this today.

Now if only they did it correctly, rather than treating
"Content-Encoding" as if it were "Transfer-Encoding". But we have
already had that discussion. On the good side, at least some vendors
are starting to recognize that on-the-fly gzip is really best done as a
transfer encoding, not a content encoding.
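To make the distinction concrete (my own sketch, with a made-up response body): Content-Encoding describes the entity itself, so the gzipped bytes are the representation that caches and validators refer to, while Transfer-Encoding is a hop-by-hop transport detail that intermediaries may add or remove without changing the resource.

```python
import gzip

# Hypothetical response body, compressed on the fly.
body = b"<html><body>Hello, world</body></html>" * 10
compressed = gzip.compress(body)

# With "Content-Encoding: gzip", the compressed bytes ARE the entity:
# Content-Length counts the gzipped bytes, and cache validators (ETag)
# refer to this gzipped representation.
content_encoded_headers = {
    "Content-Encoding": "gzip",
    "Content-Length": str(len(compressed)),
}

# With "Transfer-Encoding: gzip", compression is a hop-by-hop transport
# detail: the entity is still the uncompressed bytes, and any hop may
# strip the encoding without changing the resource's identity.
transfer_encoded_headers = {
    "Transfer-Encoding": "gzip, chunked",
}

# Either way, the receiver recovers the same original bytes.
assert gzip.decompress(compressed) == body
```

This is why on-the-fly compression fits transfer encoding better: the compressed bytes are ephemeral transport framing, not a distinct representation of the resource.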

Related to this discussion, I expect to have quite interesting
statistics on HTTP header compression in a few weeks. I'll make a note
to return with that data.


Received on Friday, 12 January 2007 03:57:42 UTC

This archive was generated by hypermail 2.3.1 : Tuesday, 1 March 2016 11:10:41 UTC