
[whatwg] Video playback quality metric

From: James Graham <jgraham@opera.com>
Date: Tue, 10 Feb 2009 10:38:01 +0100
Message-ID: <49914AF9.10804@opera.com>
Jeremy Doig wrote:
> Measuring the rate at which the playback buffer is filling/emptying gives a
> fair indication of network goodput, but there does not appear to be a way to
> measure just how well the client is playing the video itself. If I have a
> wimpy machine behind a fat network connection, you may flood me with HD that
> I just can't play very well. The cpu or video card may just not be able to
> render the video well. Exposing a metric (eg: Dropped Frame count, rendered
> frame rate) would allow sites to dynamically adjust the video which is being
> sent to a client [eg: switch the url to a differently encoded file] and
> thereby optimize the playback experience.
> Anyone else think this would be good to have ?
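The switching logic Jeremy describes could be sketched roughly as below. This is a hypothetical illustration, not a proposed API: the `getPlaybackQuality()` accessor and its `droppedFrames`/`totalFrames` fields are invented names for the metric being requested, and the threshold is an arbitrary example value.

```javascript
// Decide whether the client is struggling: true when more than
// `threshold` of the frames it was asked to render were dropped.
function shouldDownshift(droppedFrames, totalFrames, threshold = 0.1) {
  if (totalFrames === 0) return false; // nothing rendered yet
  return droppedFrames / totalFrames > threshold;
}

// Hypothetical polling loop using an assumed metric accessor on the
// media element: when the machine cannot keep up, "switch the url to a
// differently encoded file" as the proposal suggests, regardless of
// how fast the network is filling the buffer.
function monitor(video, lowQualityUrl) {
  setInterval(() => {
    const q = video.getPlaybackQuality(); // assumed, not a real 2009 API
    if (shouldDownshift(q.droppedFrames, q.totalFrames)) {
      video.src = lowQualityUrl;
    }
  }, 5000);
}
```

The point of such a metric is that it reflects decode/render capacity rather than network goodput, which buffer-fill measurements already cover.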

It seems like, in the short term at least, the "worse is better" 
solution to this problem is for content providers to provide links to 
resources at different quality levels, and allow users to choose the 
most appropriate resource based on their internet connection and their 
computer rather than having the computer try to work it out for them. 
Assuming that the majority of users use a relatively small number of 
sites with the resources to provide multiple-quality versions of their 
videos and use a small number of computing devices with roughly 
unchanging network conditions (I imagine this scenario applies to the 
majority of non-technical users), they will quickly learn which versions of 
the media work best for them on each site. Therefore the burden of this 
simple approach on end users does not seem to be very high.

Given this, I would prefer that automatic quality negotiation be deferred to HTML6.
Received on Tuesday, 10 February 2009 01:38:01 UTC
