
[Bug 23169] reconsider the jitter video quality metrics again

From: <bugzilla@jessica.w3.org>
Date: Tue, 22 Oct 2013 16:02:31 +0000
To: public-html-bugzilla@w3.org
Message-ID: <bug-23169-2486-uBxVNVFzOA@http.www.w3.org/Bugs/Public/>

--- Comment #8 from Mark Watson <watsonm@netflix.com> ---
I believe that either total frame delay or dropped frame count could meet the
requirement.

In either case, the threshold can be a display refresh interval - that is, a
frame is 'late' if it is displayed in the wrong refresh interval.
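The threshold described above can be sketched as a small check; this is an illustrative helper (the function name, parameters, and millisecond units are assumptions, not part of any spec): a frame counts as late if its displayed time misses its scheduled time by at least one refresh interval.

```javascript
// Hypothetical helper: a frame is "late" if it slipped into the wrong
// display refresh interval, i.e. its delay is at least one refresh
// interval long. refreshRateHz is the display rate (e.g. 60 for 60Hz).
function isFrameLate(scheduledTimeMs, displayedTimeMs, refreshRateHz) {
  const refreshIntervalMs = 1000 / refreshRateHz; // ~16.67ms at 60Hz
  return (displayedTimeMs - scheduledTimeMs) >= refreshIntervalMs;
}
```

For example, at 60Hz a 10ms delay stays within the intended refresh interval, while a 17ms delay pushes the frame into the next one.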

I still have a mild preference for total frame delay, but without a strong
rationale for that preference ;-)

To answer your questions:

- What is the scenario where late frames are tolerated for a while w/o
triggering frame dropping?

Imagine 30fps content on a 60Hz display: a few frames are rendered late, then a
run of frames is rendered at 60fps until playback catches up. This might not be
the best UX (dropping frames to catch up might be better), but it's a possible
behavior.
- How often do you expect the web application to poll these stats to detect
this condition?

Every second or so.
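A once-per-second poll might look like the sketch below. This is only illustrative: it uses `droppedVideoFrames` from `getVideoPlaybackQuality()`; a total-frame-delay metric, if adopted, would be polled the same way, and the threshold and reaction are placeholders.

```javascript
// Pure helper: how many frames were dropped since the last sample.
// Counters from getVideoPlaybackQuality() are cumulative, so the
// per-interval figure is the difference between two samples.
function droppedSinceLastSample(prevDropped, currDropped) {
  return currDropped - prevDropped;
}

// In a browser, a web application could poll every second (sketch only;
// `video` is an HTMLVideoElement, and the reaction is hypothetical):
//
// let prev = video.getVideoPlaybackQuality().droppedVideoFrames;
// setInterval(() => {
//   const curr = video.getVideoPlaybackQuality().droppedVideoFrames;
//   if (droppedSinceLastSample(prev, curr) > 0) {
//     // e.g. switch to a lower-bitrate representation
//   }
//   prev = curr;
// }, 1000);
```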

- How long do you expect the delta between detecting late frames and the media
engine taking action to drop frames would be?

I don't know this, but I believe there can be a scenario where there is late
rendering and no frame dropping.

You are receiving this mail because:
You are the QA Contact for the bug.
Received on Tuesday, 22 October 2013 16:02:36 UTC

This archive was generated by hypermail 2.3.1 : Wednesday, 7 January 2015 16:31:45 UTC