- From: Bjoern Hoehrmann <derhoermi@gmx.net>
- Date: Tue, 21 May 2013 22:45:43 +0200
- To: "Poul-Henning Kamp" <phk@phk.freebsd.dk>
- Cc: Roland Zink <roland@zinks.de>, ietf-http-wg@w3.org
* Poul-Henning Kamp wrote:
>In message <519BAB26.2010501@zinks.de>, Roland Zink writes:
>
>>This seems to make the introduction of new compression schemes more complex.
>
>And what is the plausibility that any new compression scheme will ever
>make that worthwhile?
>
>It's not nil, but it makes a convincing impression of nil.

There are many existing compression schemes that considerably outperform
Deflate in compression ratio, some even at comparable resource consumption
for decompression and, less so, compression. Especially for something like
a large JavaScript library that you would compress only once, something
like http://en.wikipedia.org/wiki/.xz is quite near the point where I would
expect people to call for adoption of a new scheme (with a filter optimized
for such content, >10% better compression seems very plausible to me).

If the protocol cannot support adoption of a new scheme, there would be
pressure to move compression to a lower level, think
"application/compressed-javascript", and I would rather avoid going there.
My http://bjoern.hoehrmann.de/pngwolf/ was able to strip several kilobytes
off the Google homepage; that already seemed quite worth it...
-- 
Björn Höhrmann · mailto:bjoern@hoehrmann.de · http://bjoern.hoehrmann.de
Am Badedeich 7 · Telefon: +49(0)160/4415681 · http://www.bjoernsworld.de
25899 Dagebüll · PGP Pub. KeyID: 0xA4357E78 · http://www.websitedev.de/
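[Editorial note: the ratio gap the message describes can be sketched with
Python's standard-library zlib (Deflate) and lzma (the algorithm behind xz)
modules. The input below is synthetic JavaScript-like text invented for
illustration; exact byte counts vary with the input.]

```python
# Rough comparison of Deflate (zlib) vs LZMA/xz on synthetic,
# JavaScript-like text. Illustrative only; real-world ratios depend
# heavily on the content being compressed.
import lzma
import zlib

# ~150 KB of repetitive-but-varied text, deliberately larger than
# Deflate's 32 KB back-reference window, where LZMA's much larger
# dictionary pays off.
source = "".join(
    "function handler%d(e){return dispatch(%d, e.target);}\n" % (i, i * i)
    for i in range(3000)
).encode("utf-8")

deflated = zlib.compress(source, 9)       # Deflate, maximum effort
xz = lzma.compress(source, preset=9)      # LZMA, maximum effort

print("original: %d bytes" % len(source))
print("deflate:  %d bytes" % len(deflated))
print("xz/lzma:  %d bytes" % len(xz))
```

On inputs like this, the xz output comes in well under the Deflate output,
at the cost of noticeably more CPU and memory on the compression side.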
Received on Tuesday, 21 May 2013 20:46:10 UTC