
Re: Reducing HTTP payload size [was: RE: HTTP idea

From: David Morris <dwm@xpasc.com>
Date: Thu, 11 Jan 2007 18:30:06 -0800 (PST)
To: Nicholas Shanks <contact@nickshanks.com>
cc: <ietf-http-wg@w3.org>
Message-ID: <Pine.LNX.4.33.0701111822170.5719-100000@egate.xpasc.com>



On Thu, 11 Jan 2007, Nicholas Shanks wrote:

> Wouldn't large sites encode their traffic with gzip (or alternative)
> before sending?

It's hard to generalize from case-by-case examination, but Google is the
only site I've observed that uses gzip. Over the past several years I've
spent a lot of time measuring page load timing for both well-known and
not-so-well-known sites, and I have not observed the use of gzip elsewhere.
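For context, gzip use is negotiated per request: the client advertises
support with the Accept-Encoding request header, and the server signals
compression with the Content-Encoding response header. A minimal sketch of
how one might probe whether a site serves gzip, written in Python with a
placeholder URL (not a site referenced in this thread):

    import urllib.request

    # Advertise gzip support and report what encoding the server sends back.
    # The URL is a placeholder for whatever site is being measured.
    url = "http://www.example.com/"
    req = urllib.request.Request(url, headers={"Accept-Encoding": "gzip"})
    with urllib.request.urlopen(req) as resp:
        encoding = resp.headers.get("Content-Encoding", "(none)")
        print(url, "-> Content-Encoding:", encoding)

A response of "Content-Encoding: gzip" indicates the server compressed the
payload; "(none)" indicates it sent the body uncompressed despite the
client's offer.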

Dave Morris
Received on Friday, 12 January 2007 02:30:27 GMT
