[Bug 23169] reconsider the jitter video quality metrics again

https://www.w3.org/Bugs/Public/show_bug.cgi?id=23169

Aaron Colwell <acolwell@google.com> changed:

           What    |Removed                     |Added
----------------------------------------------------------------------------
                 CC|                            |acolwell@google.com

--- Comment #2 from Aaron Colwell <acolwell@google.com> ---
I'm still not a huge fan of this metric, but here is a compromise that might
address David's concerns and hopefully be acceptable to Mark.

I propose that we introduce the concept of a "late" frame: a frame that was
displayed, but not in its intended screen refresh interval. We would then add
a lateVideoFrames counter to track the number of frames that meet this
criterion.
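A rough sketch of the resulting attribute set, written as TypeScript purely
for illustration (lateVideoFrames is the proposed addition; the other two
names are the existing counters discussed in this bug, and none of this is
spec IDL):

    // Illustrative only: the existing counters referenced in this bug
    // plus the proposed lateVideoFrames. Not actual spec IDL.
    interface VideoPlaybackQualitySketch {
      readonly totalVideoFrames: number;   // frames that should have been shown
      readonly droppedVideoFrames: number; // frames that were never displayed
      readonly lateVideoFrames: number;    // proposed: displayed, but in a
                                           // later refresh than intended
    }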

I'm still not sure whether this is going to be especially useful to the
application, so I'm going to outline the various scenarios I could see
happening. For all these examples, I assume a 60Hz refresh rate and that only
one frame will actually be displayed per refresh interval. I'll also use
notation like R0, R1, R2 to refer to individual refresh intervals.

Scenario 1: Clip frame rate > refresh rate.
Say we have 240fps content. Since the refresh rate is only 60Hz, I'd expect
droppedVideoFrames to increment by 3 for every 4 totalVideoFrames, because
only 1 of the 4 frames falling within each refresh interval would get
displayed. In this case we don't need late frames.
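To make that arithmetic concrete, here is a minimal sketch (the function name
is hypothetical) of the expected steady-state drop count when the clip frame
rate exceeds the refresh rate:

    // Per-second counter growth when clipFps > refreshHz, assuming only
    // one frame is displayed per refresh interval (per the assumption
    // above).
    function expectedDropsPerSecond(clipFps: number, refreshHz: number): number {
      // Each refresh interval receives clipFps / refreshHz frames but
      // can display only one; the remainder are dropped.
      return Math.max(0, clipFps - refreshHz);
    }

    expectedDropsPerSecond(240, 60); // 180 of 240 frames: 3 out of every 4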

Scenario 2: Clip frame rate == refresh rate.
For 60fps content, I would expect droppedVideoFrames to reflect any missed
refresh intervals. For example, if a frame was supposed to be displayed in R0
but wasn't displayed until R2, I would expect the frames that should have been
displayed in R1 & R2 to cause the droppedVideoFrames counter to increment
twice, because those frames were "too late" to display. If we add the concept
of a "late" frame, then I would expect the lateVideoFrames count to be
incremented by 1, since 1 frame missed its deadline, and droppedVideoFrames
would roughly reflect how late it was.
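A minimal sketch of this counting rule as I read it (classify and its
parameters are hypothetical names, not anything from the spec):

    // Given the refresh interval a frame was scheduled for, the interval
    // it was actually shown in, and how many refresh intervals each
    // frame owns (1 for 60fps content on a 60Hz display), return the
    // increments to the two counters.
    function classify(scheduledRefresh: number,
                      actualRefresh: number,
                      refreshesPerFrame: number): { late: number; dropped: number } {
      if (actualRefresh <= scheduledRefresh) {
        return { late: 0, dropped: 0 }; // displayed on time
      }
      // The frame was shown but missed its deadline: one late frame.
      // Each whole frame period of slip displaces one subsequent frame,
      // which is what droppedVideoFrames records.
      const slip = actualRefresh - scheduledRefresh;
      return { late: 1, dropped: Math.floor(slip / refreshesPerFrame) };
    }

    // Scenario 2 example: the R0 frame of a 60fps clip is shown at R2.
    classify(0, 2, 1); // { late: 1, dropped: 2 } -- the R1 & R2 frames lost out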

Scenario 3: Clip frame rate < refresh rate.
Say we have 15fps content. I'd expect frames to be scheduled for display at
R0, R4, R8, etc. If the R0 frame is displayed in R1, R2, or R3, it increments
the lateVideoFrames counter because the display deadline was missed. This late
display would not cause the droppedVideoFrames counter to increment, because
the display of another frame was not affected. If the R0 frame was displayed
at R4-R7, then I'd expect the lateVideoFrames & droppedVideoFrames counters to
both increment, because we had 1 late display that also resulted in a dropped
frame.
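Plugging Scenario 3 into the same classify() sketch from above (hypothetical
names as before):

    // Scenario 3: 15fps content on a 60Hz display, so each frame owns
    // four refresh intervals (refreshesPerFrame == 4).
    classify(0, 2, 4); // shown at R2: { late: 1, dropped: 0 } -- late only
    classify(0, 5, 4); // shown at R5: { late: 1, dropped: 1 } -- late plus a drop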


From my perspective, Scenario 3 is the only one where I think we would benefit
from the "late" frame concept. It isn't clear to me whether it would provide a
huge benefit, especially if the clip frame rate and the refresh rate are
pretty close to each other. The benefit would seem to increase as the frame
rate drops, but low-frame-rate content doesn't usually tax the CPU as heavily,
so it doesn't seem like the application would have much room to adapt downward
anyway. Also, any delay longer than the clip's frame duration would show up as
dropped frames, so I'm not sure what this extra signal is buying us.

Received on Friday, 13 September 2013 20:06:07 UTC