
Re: Support for gzip at the server #424

From: Nicolas Mailhot <nicolas.mailhot@laposte.net>
Date: Mon, 17 Mar 2014 08:59:30 +0100
Message-ID: <c8de40466b84c502723d9b8053962797.squirrel@arekh.dyndns.org>
To: "Bjoern Hoehrmann" <derhoermi@gmx.net>
Cc: "Martin Thomson" <martin.thomson@gmail.com>, "HTTP Working Group" <ietf-http-wg@w3.org>

On Fri, 14 Mar 2014 01:56, Bjoern Hoehrmann wrote:
> * Martin Thomson wrote:
>>Discussion offline leads me to conclude that doing this would be a bad
>> idea.
>>
>>The basic problem is with 2->1.1 translation at intermediaries.  While
>>the HTTP/2 request might include a Content-Length, the gateway is
>>going to be forced to decompress and buffer the entire request body
>>before forwarding to a 1.1 server.
>>
>>This doesn't seem like a good outcome.  Maybe this is something to
>>defer to the version of HTTP/x that ships when most of the world is
>>already using HTTP/2.
>
> I am not really following what this thread is about. HTTP/1.1 does not
> (de jure) need Content-Length anywhere, requests and responses can both
> be compressed and chunked. Why does that not matter here?
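
To make the trade-off in the quoted messages concrete, here is a minimal
sketch (in Go; the helper name and origin URL are invented for the example)
of the two options a 2->1.1 gateway has once the origin cannot accept a
gzip-compressed request body: buffer the decompressed bytes to recompute
Content-Length, or stream them and fall back to chunked transfer-coding.

package main

import (
    "bytes"
    "compress/gzip"
    "fmt"
    "io"
    "net/http"
)

// forwardDecompressed rebuilds an outgoing HTTP/1.1 request whose body
// arrived gzip-compressed. It illustrates the gateway's two choices:
// buffer everything to learn the new Content-Length, or stream and let
// the transport use chunked transfer-coding.
func forwardDecompressed(origin string, compressed io.Reader, buffer bool) (*http.Request, error) {
    zr, err := gzip.NewReader(compressed)
    if err != nil {
        return nil, err
    }

    if buffer {
        // Option 1: decompress everything up front. Content-Length is
        // exact, but memory and latency grow with the body size.
        var buf bytes.Buffer
        if _, err := io.Copy(&buf, zr); err != nil {
            return nil, err
        }
        return http.NewRequest(http.MethodPost, origin, &buf)
    }

    // Option 2: stream the decompressed bytes. The length is unknown, so
    // the client transport sends the body with Transfer-Encoding: chunked.
    req, err := http.NewRequest(http.MethodPost, origin, zr)
    if err != nil {
        return nil, err
    }
    req.ContentLength = -1
    return req, nil
}

func main() {
    // A small gzip-compressed payload standing in for the request body.
    var z bytes.Buffer
    zw := gzip.NewWriter(&z)
    zw.Write([]byte("hello, origin"))
    zw.Close()

    buffered, _ := forwardDecompressed("http://origin.example/upload", bytes.NewReader(z.Bytes()), true)
    fmt.Println("buffered Content-Length:", buffered.ContentLength) // 13

    streamed, _ := forwardDecompressed("http://origin.example/upload", bytes.NewReader(z.Bytes()), false)
    fmt.Println("streamed Content-Length:", streamed.ContentLength) // -1 => chunked
}

The buffered path is the outcome Martin describes; the streamed path is what
the remark about chunked request bodies allows.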

Content-Length can be used by intermediaries to dispatch big downloads
(ISOs, etc.) to a dedicated anti-malware scanner, where they won't block the
processing of interactive web browsing.
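
As a rough sketch of that kind of dispatch (the threshold, sizes, and
scanner labels below are invented for illustration; the point is only that
the routing decision hinges on a declared Content-Length):

package main

import (
    "fmt"
    "net/http"
)

// Illustrative threshold only; a real deployment would pick its own cut-off.
const bigDownloadThreshold = 64 << 20 // 64 MiB

// chooseScanner routes a transfer to an anti-malware path using nothing but
// the declared Content-Length. When the length is unknown (chunked or
// length-less bodies), the intermediary cannot make this call up front.
func chooseScanner(resp *http.Response) string {
    switch {
    case resp.ContentLength < 0:
        return "length unknown: buffer first or scan inline"
    case resp.ContentLength >= bigDownloadThreshold:
        return "dedicated big-download scanner"
    default:
        return "inline scanner (interactive traffic)"
    }
}

func main() {
    // Fake responses just to exercise the decision; the sizes are made up.
    page := &http.Response{ContentLength: 120 * 1024}
    iso := &http.Response{ContentLength: 4 << 30}
    chunked := &http.Response{ContentLength: -1}

    fmt.Println(chooseScanner(page))    // inline scanner (interactive traffic)
    fmt.Println(chooseScanner(iso))     // dedicated big-download scanner
    fmt.Println(chooseScanner(chunked)) // length unknown: buffer first or scan inline
}

A body delivered chunked, or with its length hidden by on-the-fly
compression, forces the intermediary to either buffer it or scan it inline.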

Regards,

-- 
Nicolas Mailhot
Received on Monday, 17 March 2014 08:00:16 UTC
