
Re: Change Proposal for ISSUE-147

From: David Singer <singer@apple.com>
Date: Fri, 18 Mar 2011 13:53:54 -0700
Cc: public-html@w3.org
Message-Id: <E83D0D85-62ED-47B3-861B-EB4EF9E40536@apple.com>
To: Philip Jägenstedt <philipj@opera.com>

On Mar 18, 2011, at 1:39, Philip Jägenstedt wrote:

> 
> In principle, this seems OK, if a bit unnecessary. We already have the raw snapshot metric for determining playback speed: currentTime. Would actualPlaybackRate be the derivative of that over a defined period of time?
> 
> Anyway, it's not at all clear to me what scripts would actually do with this information. Tell the users that their browser sucks?


Please remember that there are sources that might not be seekable at all.  For example, if I have a URL form that addresses a TV tuner, you are either tuned in and playing at 1.0, or not playing at all.  Similarly, a hypothetical URL that designates your camera as the source cannot vary its rate in any way. If your connection is RTSP/RTP, you can request non-1.0 playback rates, but the server might 'suck' and refuse.

So it might not be your browser (or mine) that sucks.
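(For what it's worth, Philip's "derivative of currentTime" idea can already be approximated in script today. The sketch below is only an illustration of that workaround, not part of the proposal; estimateRate and the sample-object shape are made-up names, and the sampling loop at the end assumes a browser media element.)

```javascript
// Hypothetical helper: estimate the *effective* playback rate by comparing
// how far media.currentTime advanced against elapsed wall-clock time.
// Useful when the source (tuner, camera, RTSP server) silently ignores
// the requested playbackRate.
function estimateRate(sampleA, sampleB) {
  // Each sample: { wallTime: seconds, mediaTime: value of media.currentTime }
  const wallDelta = sampleB.wallTime - sampleA.wallTime;
  const mediaDelta = sampleB.mediaTime - sampleA.mediaTime;
  // Guard against zero or negative intervals (clock skew, reordered samples).
  return wallDelta > 0 ? mediaDelta / wallDelta : NaN;
}

// Browser-only usage sketch (not runnable outside a page with a <video>):
//
//   const media = document.querySelector('video');
//   let prev = { wallTime: performance.now() / 1000,
//                mediaTime: media.currentTime };
//   setInterval(() => {
//     const next = { wallTime: performance.now() / 1000,
//                    mediaTime: media.currentTime };
//     console.log('effective rate:', estimateRate(prev, next));
//     prev = next;
//   }, 1000);
```

A server honoring a 2.0 rate request would yield an estimate near 2.0; a tuner pinned at 1.0 would report ~1.0 regardless of what was requested.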

David Singer
Multimedia and Software Standards, Apple Inc.
Received on Friday, 18 March 2011 20:54:28 GMT
