- From: Dave Kristol <dmk@bell-labs.com>
- Date: Wed, 19 Feb 1997 12:28:01 -0500
- To: http-wg@cuckoo.hpl.hp.com
Roy T. Fielding wrote:
> [...]
> Then it sounds like they have digressed, because all of my tests were
> with text/html content with both "x-gzip" and "x-compress" encoding.
> The browsers would retrieve the content and decompress it before
> rendering. The only ones which did not do so were the Mac-based clients
> which did not (at that time) have a library for gzip decompression.
> This was two years ago (decades in web-years), but I am surprised that
> things would change so much in an incompatible way.
> [...]

Roy's understanding matches mine. I have to believe there's a miscommunication here between people, because I believe the software continues to work this way.

If I'm right, then Roy is right that the only thing necessary to put end-to-end compression on the wire is for servers just to do it: if the user agent sends an "Accept-Encoding: gzip" (for example), the server can gzip-compress the content, on the fly if necessary, and add a "Content-Encoding: gzip" header in the response. The original Content-Type would still apply (not necessarily application/octet-stream).

Dave Kristol
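The server-side negotiation Dave describes can be sketched as follows. This is a minimal, illustrative sketch (the function name and header dictionary are hypothetical, not from any particular server): if the client's Accept-Encoding header lists gzip, the body is compressed on the fly and a Content-Encoding header is added, while the original Content-Type is left untouched.

```python
import gzip

def negotiate_encoding(accept_encoding, content_type, body):
    """Hypothetical sketch of end-to-end compression negotiation.

    If the client's Accept-Encoding lists gzip (or x-gzip), compress
    the body on the fly and add Content-Encoding: gzip.  The original
    Content-Type still applies to the decoded entity.
    """
    # Parse the comma-separated coding tokens, dropping any q-values.
    tokens = [t.strip().split(";")[0] for t in accept_encoding.split(",")]
    headers = {"Content-Type": content_type}
    if "gzip" in tokens or "x-gzip" in tokens:
        body = gzip.compress(body)
        headers["Content-Encoding"] = "gzip"
    headers["Content-Length"] = str(len(body))
    return headers, body
```

Note that Content-Type describes the underlying entity (e.g. text/html), while Content-Encoding describes the transformation applied to it, which is why the type does not become application/octet-stream.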
Received on Wednesday, 19 February 1997 09:32:58 UTC