
Re: Change Proposal for ISSUE-147

From: Philip Jägenstedt <philipj@opera.com>
Date: Fri, 18 Mar 2011 09:39:33 +0100
To: public-html@w3.org
Message-ID: <op.vsi6b70csr6mfa@localhost.localdomain>
On Thu, 17 Mar 2011 21:37:27 +0100, Ian Hickson <ian@hixie.ch> wrote:

> On Thu, 17 Mar 2011, Philip Jägenstedt wrote:
>> This CP assumes that the UA knows beforehand which playback rates it can
>> support. Like many things in media, the only way of knowing for sure may
>> be to try it, so how should a UA handle a situation like that?
> It also seems like what playback rate is achievable might change in real
> time, either based on changing stream characteristics, or based on
> the CPU load varying. Also, some platforms have a limit to how many
> simultaneous streams they can decode, in which case some streams would be
> decoding at the actual rate (.playbackRate) and some would be decoding at
> zero rate.
> What might make sense is to do something like what Silvia proposed, but
> instead of changing the existing API, just adding an API that returns
> current playback metrics. That is, have playbackRate and
> defaultPlaybackRate work as specced now, but add a .metrics object that
> includes amongst other things an .actualPlaybackRate attribute that gives
> the actual result. It would make a lot of sense to have this if we add to
> it the other metrics that browsers are exposing in vendor extensions.

In principle, this seems OK, if a bit unnecessary. We already have the raw  
snapshot metric for determining playback speed: currentTime. Would  
actualPlaybackRate be the derivative of that over a defined period of time?
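
(As a rough illustration of what that derivative amounts to, a script can  
already approximate the achieved rate by sampling currentTime against  
wall-clock time. The estimatePlaybackRate() helper below is only a  
hypothetical sketch, not part of any proposal, and assumes the element  
keeps playing and isn't seeked during the sample window.)

  // Sketch: approximate the achieved playback rate by comparing how far
  // currentTime advanced with how much wall-clock time elapsed.
  function estimatePlaybackRate(video, sampleMs, callback) {
    var startMediaTime = video.currentTime;
    var startWallTime = new Date().getTime();
    setTimeout(function () {
      var mediaDelta = video.currentTime - startMediaTime;
      var wallSeconds = (new Date().getTime() - startWallTime) / 1000;
      callback(mediaDelta / wallSeconds);
    }, sampleMs);
  }

  // e.g. compare against the requested rate:
  // estimatePlaybackRate(video, 1000, function (actual) {
  //   console.log(video.playbackRate, actual);
  // });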

Anyway, it's not at all clear to me what scripts would actually do with  
this information. Tell the users that their browsers suck?

Philip Jägenstedt
Core Developer
Opera Software