- From: Martin Thomson <martin.thomson@gmail.com>
- Date: Fri, 14 Mar 2014 14:21:16 -0700
- To: Michael Sweet <msweet@apple.com>
- Cc: Roberto Peon <grmocg@gmail.com>, Patrick McManus <pmcmanus@mozilla.com>, Bjoern Hoehrmann <derhoermi@gmx.net>, HTTP Working Group <ietf-http-wg@w3.org>
On 14 March 2014 14:15, Michael Sweet <msweet@apple.com> wrote:

> The client is generally rasterizing pages of content for the printer at
> some agreed upon resolution, bit depth, and color space. This raster data
> is typically already compressed with a simple algorithm such as PackBits
> (run-length encoding) and is thus already variable-length per page with no
> way to know ahead of time how large it will be. Add gzip to the mix and
> you *really* don't know what the final length will be.

This seems like a case of "I know the server capabilities well enough to do this". I'm not sure that we could safely do the same thing for every HTTP client in existence.
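[The point about unpredictable length can be seen directly in the algorithm. Below is a minimal sketch of a PackBits-style run-length encoder — an illustration of the scheme as described in TIFF 6.0, not Apple's printing code — showing that the output size depends entirely on the data: runs of repeated bytes shrink, while varied data actually expands.]

```python
def packbits(data: bytes) -> bytes:
    """Minimal PackBits (run-length) encoder sketch.

    Runs of 2+ identical bytes become a 2-byte pair:
    header (257 - run_length, i.e. a negative count) + the byte.
    Literal stretches become: header (length - 1) + the raw bytes.
    Both runs and literals are capped at 128 bytes.
    """
    out = bytearray()
    i, n = 0, len(data)
    while i < n:
        # Count a run of identical bytes starting at i.
        run = 1
        while i + run < n and run < 128 and data[i + run] == data[i]:
            run += 1
        if run >= 2:
            out.append(257 - run)   # replicate header (two's-complement)
            out.append(data[i])
            i += run
        else:
            # Collect literal bytes until a run of 2+ starts or 128 bytes.
            start = i
            i += 1
            while i < n and i - start < 128 and not (
                i + 1 < n and data[i] == data[i + 1]
            ):
                i += 1
            out.append(i - start - 1)  # literal header
            out.extend(data[start:i])
    return bytes(out)


# Same input length, wildly different output length:
blank_page = packbits(b"\x00" * 1000)          # compresses to 16 bytes
noisy_page = packbits(bytes(range(256)) * 4)   # expands to 1032 bytes
```

A blank scanline collapses to a handful of run pairs, while photographic or dithered content has few runs and pays the literal-header overhead instead — so per-page length cannot be known until encoding finishes, which is exactly the situation described above.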
Received on Friday, 14 March 2014 21:21:43 UTC