Re: Announcing new font compression project

On Fri, Mar 30, 2012 at 2:37 PM, John Hudson <tiro@tiro.com> wrote:
> On 27/03/12 3:08 PM, Raph Levien wrote:
>> We consider the format to be lossless, in the sense that the _contents_ of
>> the font file are preserved 100%. That said, the decompressed font is not
>> bit-identical to the source font, as there are many irrelevant details such
>> as padding and redundant ways of encoding the same data (for example, it's
>> perfectly valid, but inefficient, to repeat flag bytes in a simple glyph
>> instead of using the repeat code). A significant amount of the compression
>> is due to stripping these out.
>
> I wonder how this compares to the standard of losslessness required by the
> WOFF spec?

It's very close.
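
To make concrete the flag redundancy Raph mentions: in a TrueType
simple glyph, a run of identical flag bytes can instead be written
once with the REPEAT_FLAG bit (0x08) set, followed by a count of
additional repetitions. A rough sketch of a canonicalizing encoder
(illustrative only, not the project's actual code):

    REPEAT_FLAG = 0x08  # bit 3 of a simple-glyph flag byte

    def pack_flags(flags):
        """Re-encode per-point flag bytes using the repeat mechanism.

        `flags` holds the logical flag values (repeat bit clear); the
        result is the byte sequence as stored in the glyf table.
        """
        out = bytearray()
        i = 0
        while i < len(flags):
            run = 1
            # A repeat-count byte covers up to 255 extra repetitions.
            while (i + run < len(flags)
                   and flags[i + run] == flags[i] and run < 256):
                run += 1
            if run > 1:
                out.append(flags[i] | REPEAT_FLAG)  # set the repeat bit
                out.append(run - 1)  # "repeat this many more times"
            else:
                out.append(flags[i])
            i += run
        return bytes(out)

A decoder that expands repeats plus an encoder that always re-packs
them this way leaves every glyph with a single canonical flag
encoding, which is why the decompressed bytes can differ from the
source while describing exactly the same font.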

Taking a virgin file and round-tripping it through the codec gives you
a file that's rendering-identical, but not bit-identical. Round-trip
that output a second time, though, and the result is bit-identical to
the output of the first pass: the codec converges to a stable form
after one round-trip. So it's still possible to do checksum/hash-based
signing of font files with this format; the files just need a single
round-trip through the codec first to reach that stable state.


> Raph, presuming that this new compression method is judged worthwhile --
> which seems likely -- how do you see it progressing? Is this something
> that you hope will be adopted by the W3C as, e.g., WOFF 2.0?

Yes, that's the goal we're hoping for!

~TJ
