
Re: Compressing HTML

From: Henrik Frystyk Nielsen <frystyk@w3.org>
Date: Tue, 09 Sep 1997 22:01:48 -0400
Message-Id: <>
To: Andrew Daviel <advax@triumf.ca>, www-talk@w3.org
At 12:28 AM 09/09/97 -0700, Andrew Daviel wrote:
>I just wondered what if anything had happened to the idea of compressing
>HTML. On my Unix server with Unix Netscape I can serve a regular 123kb
>file, or the same one with Content-encoding: x-gzip at 3268 bytes.
>On Windows it doesn't work. 
>I believe that gzip is required for compliant VRML viewers, and there
>is certainly a gzip binary for Win.x.
>Although many last-mile links like 28.8 modems use compression
>I believe that most backbone links, proxy cache etc. do not, so
>it would seem to be a useful feature.

Compression of HTML can be a big win in many situations, and the win can be
improved further if you canonicalize the HTML in certain ways before
compressing it. You can have a look at the performance work that we have
done at W3C [1][2] and that Jeff Mogul has done at DEC/WRL [3].

Libwww [4] has an implementation of on-the-fly decompression using the zlib
library, which we used to make our measurements. Zlib is written in C and
is available for most platforms, including Windows.


[1] http://www.w3.org/Protocols/HTTP/Performance/
[2] http://www.w3.org
[3] http://www.acm.org/sigcomm/sigcomm97/papers/p156.ps
[4] http://www.w3.org/Library/

Henrik Frystyk Nielsen
Received on Tuesday, 9 September 1997 22:22:23 UTC
