[Bug 23169] reconsider the jitter video quality metrics again

https://www.w3.org/Bugs/Public/show_bug.cgi?id=23169

Mark Watson <watsonm@netflix.com> changed:

           What    |Removed                     |Added
----------------------------------------------------------------------------
                 CC|                            |watsonm@netflix.com

--- Comment #1 from Mark Watson <watsonm@netflix.com> ---
Regarding 1, 2, 3:

These are fair points. It's probably not correct to require "to the nearest
microsecond"; the whole thing will always be approximate.

The downstream delay due to display refresh will, on average, be half the
refresh interval, so on average it could be accountedted for by subtracting
that amount.
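A small sketch of the point above: a frame that becomes ready at a random
moment waits, on average, half a refresh interval for the next repaint. The
function name and units here are illustrative, not from any spec.

```typescript
// Average delay contributed by display refresh alone, in seconds.
// A frame ready at a uniformly random time waits, on average, half a
// refresh interval before the next vsync.
function expectedRefreshDelay(refreshRateHz: number): number {
  return (1 / refreshRateHz) / 2;
}

// At 60 Hz this is about 8.3 ms of delay even with perfect rendering,
// which could be subtracted out when interpreting the metric.
const baseline = expectedRefreshDelay(60);
```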

Regarding 4:

The user of this property should have some notion of the number of frames
displayed, or at least of elapsed time, which will suffice so long as the
frame rate is roughly constant. The intention is that they would sample this
property regularly and evaluate its rate of change. A rate of change below
some threshold is in the noise, or indicates perfect rendering. A rate of
change above some threshold indicates consistent late rendering.

The application is to detect CPU overload, which in some systems manifests as
dropped frames, but in other systems manifests first as late rendering before
frames are dropped in order to "catch up" (if things don't get back on track).
An app can track the severity of such events over time and decide to stream at
a lower rate on this device.
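The sampling scheme described above could look roughly like this. This is a
sketch only: it assumes a cumulative delay metric in seconds (as discussed for
this bug), and the sample shape, function names, and threshold value are all
illustrative, not taken from the spec.

```typescript
// One periodic sample of the cumulative delay metric.
interface DelaySample {
  time: number;            // wall-clock time of the sample, seconds
  totalFrameDelay: number; // cumulative delay metric at that time, seconds
}

// Rate of change of accumulated delay between two samples:
// seconds of delay accrued per second of wall-clock time.
function delayRate(prev: DelaySample, curr: DelaySample): number {
  return (curr.totalFrameDelay - prev.totalFrameDelay) / (curr.time - prev.time);
}

// A rate near zero is noise or perfect rendering; a sustained rate above
// some threshold suggests consistent late rendering (e.g. CPU overload),
// and the app might switch to a lower stream rate. The 0.05 default is an
// arbitrary placeholder, not a recommended value.
function isConsistentlyLate(rate: number, threshold = 0.05): boolean {
  return rate > threshold;
}
```

In practice an app would keep the previous sample, compute the rate each
sampling interval, and act only when the rate stays above the threshold for
several consecutive intervals.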

-- 
You are receiving this mail because:
You are the QA Contact for the bug.

Received on Thursday, 5 September 2013 19:19:25 UTC