[Bug 23169] reconsider the jitter video quality metrics again

https://www.w3.org/Bugs/Public/show_bug.cgi?id=23169

--- Comment #8 from Mark Watson <watsonm@netflix.com> ---
I believe that either total frame delay or dropped frame count could meet the
requirement.

In either case, the threshold can be one display refresh interval - that is, a
frame is 'late' if it is displayed in the wrong refresh interval.
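
For illustration, here is a minimal TypeScript sketch of that lateness test.
The function and parameter names are mine, not anything from the spec:

    // Hypothetical sketch: a frame is 'late' if its actual display time
    // falls in a later refresh interval than the one it was scheduled for.
    function isLate(
      expectedDisplayTimeMs: number,
      actualDisplayTimeMs: number,
      refreshIntervalMs: number, // e.g. 1000 / 60 on a 60Hz display
    ): boolean {
      const scheduled = Math.floor(expectedDisplayTimeMs / refreshIntervalMs);
      const actual = Math.floor(actualDisplayTimeMs / refreshIntervalMs);
      return actual > scheduled;
    }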

I still have a mild preference for total frame delay, but without a strong
rationale for that preference ;-)

To answer your questions:

- What is the scenario where late frames are tolerated for a while w/o
triggering frame dropping?

Imagine 30fps content on a 60Hz display: a few frames are rendered late, and
then a run of frames is rendered at 60fps until playback catches up. This might
not be the best UX (dropping frames to catch up might be better), but it's a
possible behaviour.
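
To make the arithmetic concrete, here is an illustrative TypeScript simulation
of that scenario (my own model of it, nothing normative): total frame delay
accumulates while frames slip, then drains during the 60fps catch-up, and the
dropped frame count never moves.

    // 30fps content on a 60Hz display.
    const contentIntervalMs = 1000 / 30; // ~33.3ms between content frames
    const refreshIntervalMs = 1000 / 60; // ~16.7ms between display refreshes

    let totalFrameDelayMs = 0;
    let droppedFrames = 0; // never incremented in this scenario
    let lateByMs = 0;

    // Phase 1: three consecutive frames each slip one further refresh interval.
    for (let i = 0; i < 3; i++) {
      lateByMs += refreshIntervalMs;
      totalFrameDelayMs += lateByMs; // the metric accumulates per-frame delay
    }

    // Phase 2: frames are presented every refresh (60fps) while content is due
    // every second refresh (30fps), so each frame is one refresh less late.
    while (lateByMs > 0) {
      lateByMs -= contentIntervalMs - refreshIntervalMs;
      totalFrameDelayMs += Math.max(lateByMs, 0);
    }

    console.log(totalFrameDelayMs.toFixed(1), droppedFrames); // ~150.0 0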

- How often do you expect the web application to poll these stats to detect
this condition?

Every second or so.
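
As a sketch of that cadence, here is a TypeScript snippet polling
HTMLVideoElement.getVideoPlaybackQuality() (the API this bug concerns) once per
second; note that droppedVideoFrames shipped, while totalFrameDelay was still
only a proposal at this point:

    // Read playback-quality stats roughly once per second.
    const video = document.querySelector('video') as HTMLVideoElement;

    setInterval(() => {
      const q = video.getVideoPlaybackQuality();
      console.log(`dropped=${q.droppedVideoFrames} total=${q.totalVideoFrames}`);
    }, 1000);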

- How long do you expect the delta to be between detecting late frames and the
media engine taking action to drop frames?

I don't know, but I believe there can be a scenario where there is late
rendering but no frame dropping.
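
A hypothetical detector for that scenario, comparing successive ~1s samples of
the two candidate metrics (the sample type and function names are mine, not
from the spec):

    interface QualitySample {
      totalFrameDelayS: number;  // the proposed metric, in seconds
      droppedVideoFrames: number;
    }

    // True when frames ran late in the last interval but the media
    // engine did not respond by dropping any.
    function lateWithoutDropping(prev: QualitySample, curr: QualitySample): boolean {
      return curr.totalFrameDelayS > prev.totalFrameDelayS
          && curr.droppedVideoFrames === prev.droppedVideoFrames;
    }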

-- 
You are receiving this mail because:
You are the QA Contact for the bug.

Received on Tuesday, 22 October 2013 16:02:36 UTC