WOFF 2.0 (Brotli) Compression Update

Hello working group,

Just a quick update...

The Google Compression team has been heads down building the WOFF
2.0 compression / decompression tools,
the test framework, and the Brotli algorithm.  The early work is
incredibly exciting; see below for the full update.

In parallel, Fil and Kenji-san are going to start the open-sourcing
process for the code, which will allow the compression team to
continue to focus on improvements to Brotli.

Additionally, Jyrki, Zoltan, Raph, Kenji-san, Fil and I are all
planning on attending the working group meeting in Amsterdam in
person.  We are really excited and looking forward to seeing everyone
there.

More updates coming next week.  Thank you!

---------- Forwarded message ----------
From: Zoltan Szabadka, Jyrki Alakuijala

I can give a quick update on the current status of brotli. I measured
compression ratio and decompression time with my benchmarking tools
over the Google Fonts corpus (1194 files, 156317376 bytes total, see
/home/szabadka/google-fonts), both with and without the glyf
transformation.

Without transforms, the total compressed size over the whole corpus
using gzip, lzma and brotli are 71414228, 53925660, and 60490148 bytes
respectively, and the decompression speeds are 185 MB/s, 40 MB/s and
96 MB/s.

With transforms, the compressed sizes are 60844848, 47984708 and
52967496 bytes respectively, and the decompression speeds are
100 MB/s, 36 MB/s and 66 MB/s.
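To make the byte totals above easier to compare, here is a small worked computation of the compression ratios (compressed size divided by the 156317376-byte uncompressed corpus). The numbers are taken directly from the figures above; the script itself is just illustrative arithmetic, not part of the benchmarking tools.

```python
# Compression ratios for the reported corpus numbers (all sizes in bytes).
UNCOMPRESSED = 156_317_376  # total size of the 1194-file corpus

results = {
    # codec: (size without transforms, size with glyf transform)
    "gzip":   (71_414_228, 60_844_848),
    "lzma":   (53_925_660, 47_984_708),
    "brotli": (60_490_148, 52_967_496),
}

for codec, (plain, transformed) in results.items():
    print(f"{codec:7s} "
          f"no-transform: {plain / UNCOMPRESSED:.1%}  "
          f"with-transform: {transformed / UNCOMPRESSED:.1%}")
```

This puts brotli without transforms at roughly 38.7% of the original size, between gzip (about 45.7%) and lzma (about 34.5%), while decompressing more than twice as fast as lzma.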

All compressions were run with continue streams enabled.

We are still making improvements to brotli, however, and we expect
that using a zopfli-style algorithm to compute the backward references
will yield an additional 4% improvement in compression ratio.
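For readers unfamiliar with the term: LZ77-family codecs such as gzip and brotli replace repeated byte runs with backward references, i.e. (distance, length) pairs pointing into already-emitted data. A greedy matcher simply takes the longest match at each position; a zopfli-style parse instead weighs many candidate matches by their coded cost and picks the globally cheapest parse, which is where the extra ratio comes from. The toy greedy matcher below is a sketch of the basic idea only, and is not brotli's actual match finder.

```python
def greedy_backward_refs(data: bytes, min_len: int = 3, window: int = 1024):
    """Toy LZ77-style match finder: at each position, greedily take the
    longest backward match within the window. A zopfli-style parser
    would instead score many candidate (distance, length) pairs by
    their encoded cost and choose the cheapest overall parse.
    """
    refs = []  # (position, distance, length) triples
    i, n = 0, len(data)
    while i < n:
        best_len, best_dist = 0, 0
        # Naive O(n * window) search over earlier start positions.
        for j in range(max(0, i - window), i):
            k = 0
            while i + k < n and data[j + k] == data[i + k]:
                k += 1
            if k > best_len:
                best_len, best_dist = k, i - j
        if best_len >= min_len:
            refs.append((i, best_dist, best_len))
            i += best_len  # skip over the matched run
        else:
            i += 1  # no usable match: emit this byte as a literal
    return refs
```

For example, `greedy_backward_refs(b"abcabcabcabc")` finds a single reference at position 3 with distance 3 and length 9, covering the three repeats with one (overlapping) copy.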

Received on Wednesday, 11 September 2013 19:02:41 UTC