Re: Compressing HTML

At 12:28 AM 09/09/97 -0700, Andrew Daviel wrote:
>I just wondered what if anything had happened to the idea of compressing
>HTML. On my Unix server with Unix Netscape I can serve a regular 123kb
>file, or the same one with Content-encoding: x-gzip at 3268 bytes.
>On Windows it doesn't work. 
>I believe that gzip is required for compliant VRML viewers, and there
>is certainly a gzip binary for Win.x.
>Although many last-mile links like 28.8 modems use compression
>I believe that most backbone links, proxy cache etc. do not, so
>it would seem to be a useful feature.
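
To make the mechanism concrete, here is a hypothetical request/response
pair (the host and path are made up; the 3268-byte size is Andrew's
figure). The Accept-Encoding request header is how a client advertises
that it can handle a compressed entity body:

    GET /page.html HTTP/1.1
    Host: www.example.org
    Accept-Encoding: gzip

    HTTP/1.1 200 OK
    Content-Type: text/html
    Content-Encoding: x-gzip
    Content-Length: 3268

    ...3268 bytes of gzip data...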

Compression of HTML can be a big win in many situations, and the win can
be made bigger still if you canonicalize the HTML in certain ways before
compressing it. You can have a look at the performance work that we have
done [1] at W3C [2], and at what Jeff Mogul has done at DEC/WRL [3].
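
As an illustration of the kind of canonicalization that can help, the
filter below folds markup to lower case so the compressor sees more
repeated strings. This is only a sketch, not the set of transformations
used in our measurements; note the caveat in the comment:

    #include <ctype.h>
    #include <stdio.h>

    /* Canonicalize HTML on stdin to stdout by lowercasing markup.
     * HTML tag and attribute names are case-insensitive, so folding
     * them to one case gives the compressor more repeats to match.
     * Caveat: this naive version also lowercases quoted attribute
     * values (e.g. URLs in href), which a real filter must not do. */
    int main(void)
    {
        int c, in_tag = 0;
        while ((c = getchar()) != EOF) {
            if (c == '<')
                in_tag = 1;
            else if (c == '>')
                in_tag = 0;
            putchar(in_tag ? tolower(c) : c);
        }
        return 0;
    }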

Libwww [4] has an implementation of on-the-fly decompression using the
zlib library, which is what we used to make our measurements. The zlib
library is written in C and is available for most platforms, including
Windows.
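
For anyone who wants to experiment, the sketch below uses zlib's gzio
interface (gzopen/gzread) to decompress a gzip file to stdout. Libwww's
stream decoder is built on the lower-level inflate() calls, so take
this only as the simplest possible use of the library:

    #include <stdio.h>
    #include <zlib.h>

    /* Decompress a gzip file to stdout. Compile with: cc gunz.c -lz */
    int main(int argc, char *argv[])
    {
        char buf[8192];
        int n;
        gzFile in;

        if (argc != 2) {
            fprintf(stderr, "usage: %s file.gz\n", argv[0]);
            return 1;
        }
        in = gzopen(argv[1], "rb");
        if (in == NULL) {
            fprintf(stderr, "gzopen failed on %s\n", argv[1]);
            return 1;
        }
        while ((n = gzread(in, buf, sizeof buf)) > 0)
            fwrite(buf, 1, (size_t)n, stdout);
        gzclose(in);
        return 0;
    }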

Henrik

[1] http://www.w3.org/Protocols/HTTP/Performance/
[2] http://www.w3.org
[3] http://www.acm.org/sigcomm/sigcomm97/papers/p156.ps
[4] http://www.w3.org/Library/

--
Henrik Frystyk Nielsen,
http://www.w3.org/People/Frystyk
PGP: 0x1F71508D
