Re: WOFF 2.0 Decoder Performance Analysis (preliminary)

On Wed, May 20, 2015 at 11:23 AM, Sergey Malkin <sergeym@microsoft.com>
wrote:

>  To be clear, MB/s speeds and time to unpack in this document refer to
> the size of original uncompressed font. Is this correct?
>

Yes, that is correct.  Kenji explained the rationale in much more detail in
the following Chrome tracking bug:

https://code.google.com/p/chromium/issues/detail?id=447127
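
For concreteness, here is a minimal sketch of that metric, using entirely
hypothetical numbers rather than values from the report: the reported
throughput is the decoded, uncompressed font size divided by the time it
took to unpack it.

    # Minimal sketch of the reported metric: throughput is computed against
    # the size of the original (uncompressed) font, not the compressed WOFF
    # payload.  Numbers are hypothetical, for illustration only.
    uncompressed_font_bytes = 400_000   # size of the decoded font (bytes)
    decode_time_seconds = 0.010         # measured time to unpack the WOFF file

    throughput_mb_per_s = (uncompressed_font_bytes / 1_000_000) / decode_time_seconds
    print(f"decode throughput: {throughput_mb_per_s:.1f} MB/s")   # -> 40.0 MB/s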

>
>
> This would be also helpful to know hardware configuration used for each
> operating system.
>

I agree, data from a controlled test environment would be most useful.  We
should plan to collect more together going forward.

This data set was collected from real users running Chrome in the wild,
which likely makes it a great indicator of actual real-world performance,
and thus very different from that of our souped-up development machines. :)
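
On Vlad's cumulative-time question (quoted further below), a rough
back-of-the-envelope model may help frame the data.  The sketch below uses
entirely hypothetical sizes, bandwidth and decode rates (none of them come
from the report), just to show how download and decode times combine:

    # Hypothetical model of total font load time: transfer time + decode time.
    # All numbers below are made up for illustration, not measured values.
    def total_time_s(transfer_bytes, bandwidth_bytes_per_s,
                     uncompressed_bytes, decode_bytes_per_s):
        return (transfer_bytes / bandwidth_bytes_per_s
                + uncompressed_bytes / decode_bytes_per_s)

    uncompressed = 400_000   # decoded font size (bytes)
    bandwidth = 250_000      # ~2 Mbit/s mobile connection, in bytes/s

    # Assume WOFF 2.0 transfers ~25% less data than WOFF 1.0 but decodes at
    # half the rate (both assumptions, chosen only to illustrate the tradeoff).
    woff1 = total_time_s(200_000, bandwidth, uncompressed, 40_000_000)
    woff2 = total_time_s(150_000, bandwidth, uncompressed, 20_000_000)
    print(f"WOFF 1.0: {woff1*1000:.0f} ms   WOFF 2.0: {woff2*1000:.0f} ms")
    # -> WOFF 1.0: 810 ms   WOFF 2.0: 620 ms

With numbers anywhere in that ballpark, the extra decode time is dwarfed by
the transfer-time savings, which matches Vlad's intuition; the real data
would of course be needed to confirm it.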

>
>
> Thanks,
>
> Sergey
>
>
>
> *From:* Levantovsky, Vladimir [mailto:Vladimir.Levantovsky@monotype.com]
> *Sent:* Wednesday, May 20, 2015 10:58 AM
> *To:* David Kuettel; public-webfonts-wg@w3.org
> *Cc:* kenjibaheux@chromium.org
> *Subject:* RE: WOFF 2.0 Decoder Performance Analysis (preliminary)
>
>
>
> Thank you David, Kenji and the entire team for gathering and analyzing the
> data.
>
>
>
> I am curious whether it would be possible to extend this analysis to
> account for font data transmission times. It is highly likely that the
> environments where decoder performance was measured do not support data
> transfer rates comparable even to the slowest decoder performance. So the
> question of interest is the cumulative result of using WOFF2 – i.e.
> whether the time saved by loading a more efficiently compressed WOFF2
> font, net of the extra time required to decompress it compared to WOFF1,
> produces an overall performance benefit. (I think it should, because the
> time saved on loading should outweigh the added decompression time, given
> that decompression rates are many times higher than typical bandwidth.)
>
>
>
> If we could measure and compare the cumulative times [from the moment the
> request is made to the moment the WOFF2 data is decompressed], combining
> load and decompression, the analysis might yield very useful and
> interesting results.
>
>
>
> Thank you,
>
> Vlad
>
>
>
>
>
> *From:* David Kuettel [mailto:kuettel@google.com]
> *Sent:* Wednesday, May 20, 2015 1:36 PM
> *To:* public-webfonts-wg@w3.org
> *Cc:* kenjibaheux@chromium.org
> *Subject:* Re: WOFF 2.0 Decoder Performance Analysis (preliminary)
>
>
>
> Here is an updated link to the document:
>
>
>
>
> https://docs.google.com/document/d/1LaaOng1dFyhvM5-MhM6irfmpp_pdF4jv0n6dzCKisT8/edit?usp=sharing
>
>
>
> Many thanks to Sergey for the correction!
>
>
>
> On Wed, May 20, 2015 at 9:58 AM, David Kuettel <kuettel@google.com> wrote:
>
> Hello working group,
>
>
>
> A quick update on action item 116: decoder performance on mobile devices
>
> http://www.w3.org/Fonts/WG/track/actions/116
>
>
>
> The Chrome team added instrumentation (thank you Kenji!) to measure the
> WOFF 1.0 and 2.0 decode times, and gathered the data across multiple
> platforms.
>
>
>
> The following is a preliminary analysis of the data that was collected on
> April 22nd.  Note, however, that this approach has many limitations, which
> could be adversely affecting the results.
>
>
>
> *WOFF 2.0 Decoder Performance Analysis (preliminary)*
>
>
> https://docs.google.com/document/d/1LaaOng1dFyhvM5-MhM6irfmpp_pdF4jv0n6dzCKisT8/
>
>
>
> One incredibly exciting outcome of the early data is that the Google
> Compression team has been working their magic to make the decoding even
> faster (thank you!).  Stay tuned for updates on this front.
>
>
>
> Thank you,
>
> David
>
>
>

Received on Wednesday, 20 May 2015 20:07:36 UTC