- From: Philip Jägenstedt <philipj@opera.com>
- Date: Fri, 18 Mar 2011 22:34:05 +0100
- To: "David Singer" <singer@apple.com>
- Cc: public-html@w3.org
On Fri, 18 Mar 2011 21:53:54 +0100, David Singer <singer@apple.com> wrote:

> On Mar 18, 2011, at 1:39 , Philip Jägenstedt wrote:
>
>> In principle, this seems OK, if a bit unnecessary. We already have the
>> raw snapshot metric for determining playback speed: currentTime. Would
>> actualPlaybackRate be the derivative of that over a defined period of
>> time?
>>
>> Anyway, it's not at all clear to me what scripts would actually do with
>> this information. Tell the users that their browsers suck?
>
> Please remember that there are sources that might not be seekable at
> all. For example, if I have a URL form to address a TV tuner, you are
> either tuned in, playing at 1.0, or not. Similarly, a hypothetical URL
> that asks for the source to be your camera cannot do anything. If your
> connection is RTSP/RTP, you can ask for non-1.0 playback rates, but the
> server might 'suck' and refuse.
>
> So it might not be your browser (or mine) that sucks.

Of course, I was being ironic. It's depressingly common for web authors to
tell users to "get a better browser", or to insult them in other ways, in
this kind of situation. If there's no better use case than annoying users
for allowing scripts to detect this situation, I think we should not change
the API at all.

--
Philip Jägenstedt
Core Developer
Opera Software
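For context, the "derivative of currentTime over a defined period of time" that the thread discusses can already be approximated from script with the existing media element API. The sketch below is illustrative only; `measureEffectiveRate` and the one-second sampling window are hypothetical choices, not anything proposed in the thread.

```ts
// Minimal sketch: estimate the effective playback rate by comparing how far
// currentTime advances against wall-clock time over a sampling window.
// measureEffectiveRate is a hypothetical helper, not a proposed API.
function measureEffectiveRate(
  media: HTMLMediaElement,
  windowMs: number = 1000
): Promise<number> {
  return new Promise((resolve) => {
    const startMediaTime = media.currentTime;   // media timeline, in seconds
    const startWallTime = performance.now();    // wall clock, in milliseconds
    setTimeout(() => {
      const mediaDelta = media.currentTime - startMediaTime;
      const wallDelta = (performance.now() - startWallTime) / 1000;
      // A result near media.playbackRate suggests the requested rate is being
      // honoured; ~1.0 suggests a source that ignores rate changes (e.g. a
      // tuner-style source); ~0 suggests playback has stalled.
      resolve(wallDelta > 0 ? mediaDelta / wallDelta : 0);
    }, windowMs);
  });
}

// Usage (assumes a playing <video> element with id "v"):
// const video = document.querySelector<HTMLVideoElement>("#v")!;
// measureEffectiveRate(video).then((rate) => console.log("effective rate:", rate));
```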
Received on Friday, 18 March 2011 21:34:43 UTC