
Re: Content encoding problem...

From: Dave Kristol <dmk@bell-labs.com>
Date: Wed, 19 Feb 1997 12:28:01 -0500
Message-Id: <330B3821.6EEA4806@bell-labs.com>
To: http-wg@cuckoo.hpl.hp.com
X-Mailing-List: <http-wg@cuckoo.hpl.hp.com> archive/latest/2467
Roy T. Fielding wrote:
> [...]
> Then it sounds like they have digressed, because all of my tests were
> with text/html content with both "x-gzip" and "x-compress" encoding.
> The browsers would retrieve the content and decompress it before
> rendering.  The only ones which did not do so were the Mac-based clients
> which did not (at that time) have a library for gzip decompression.
> This was two years ago (decades in web-years), but I am surprised that
> things would change so much in an incompatible way.
> [...]

Roy's understanding matches mine.  There must be a miscommunication between
people here, because I believe the software continues to work this way.
If I'm right, then Roy is right that the only thing necessary to put end-to-end
compression on the wire is for servers just to do it:  if the user agent sends
"Accept-Encoding: gzip" (for example), the server can gzip-compress the content,
on the fly if necessary, and add a "Content-Encoding: gzip" header to the
response.  The original Content-Type would still apply (not necessarily
application/octet-stream).
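The server-side step described above can be sketched as follows — a minimal
illustration, not part of the original message, with a hypothetical
`negotiate_response` helper standing in for whatever the server actually does:

```python
import gzip

def negotiate_response(accept_encoding, body, content_type):
    """If the client advertises gzip support in Accept-Encoding,
    compress the body on the fly and add Content-Encoding: gzip,
    while the original Content-Type continues to apply."""
    headers = {"Content-Type": content_type}
    # Parse a simple Accept-Encoding list, ignoring any q-values.
    offered = [e.strip().split(";")[0] for e in accept_encoding.split(",")]
    if "gzip" in offered:
        body = gzip.compress(body)
        headers["Content-Encoding"] = "gzip"
    headers["Content-Length"] = str(len(body))
    return headers, body

headers, payload = negotiate_response(
    "gzip, deflate", b"<html>hello</html>", "text/html")
```

Note that the Content-Type stays text/html; only the Content-Encoding header
marks the compression, which is exactly the point of the paragraph above.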

Dave Kristol
Received on Wednesday, 19 February 1997 09:32:58 UTC
