- From: Ian Hickson <ian@hixie.ch>
- Date: Thu, 17 Mar 2011 03:19:44 +0000 (UTC)
- To: Frank Olivier <Frank.Olivier@microsoft.com>
- cc: Silvia Pfeiffer <silviapfeiffer1@gmail.com>, "HTML WG (public-html@w3.org)" <public-html@w3.org>
On Thu, 17 Mar 2011, Frank Olivier wrote:
> > >> . Setting playbackRate larger than 1.0 for live video will not work.
> >
> > Sure it will, so long as the playhead is far enough back that there is
> > buffered content to play.
>
> Sure, but what happens when you play through the buffered content and
> get to (effectively) a live feed? The author should be able to determine
> that the effective playback rate is now 1.0 in this outcome - and
> setting to >1.0 should not work.

The spec already covers this -- it's the same as what happens if you're
playing at a rate of 1.0 but you're receiving only one second's worth of
media every two seconds.

If we're agreed that browsers _should_ support this, why would we make it
effectively optional? Shouldn't this be a quality-of-implementation
issue, where browsers try their best to approximate the requested rate
and some do a better job than others? It would be equivalent to how some
browsers can render canvas faster than others, or how some browsers
render text more beautifully than others. We don't say that browsers that
don't support rendering text well should simply refuse to add Text nodes
to the DOM, right? Why would we require that browsers decide whether or
not they can do a good enough job of going at the requested rate and
force them to change the playbackRate accordingly?

Would reporting the playback quality (e.g. the number of frames rendered
over the past second of playback, as a fraction of the number of frames
that would ideally have been rendered during that same period) be an
acceptable alternative solution?
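To make that concrete, here is a rough sketch of how an author might use
such a metric from script. The renderedFrameCount and idealFrameCount
attributes are hypothetical names invented for this sketch; nothing of
the sort is in the spec today:

  // Sketch only: renderedFrameCount and idealFrameCount are hypothetical
  // cumulative counters, standing in for whatever shape a real
  // playback-quality metric might take.
  var video = document.querySelector('video');
  video.playbackRate = 2.0; // try to catch up towards the live edge

  var lastRendered = 0;
  var lastIdeal = 0;

  setInterval(function () {
    var rendered = video.renderedFrameCount - lastRendered;
    var ideal = video.idealFrameCount - lastIdeal;
    lastRendered = video.renderedFrameCount;
    lastIdeal = video.idealFrameCount;

    // Fraction of frames actually presented over the past second.
    var quality = ideal > 0 ? rendered / ideal : 1.0;

    if (quality < 0.9) {
      // The UA isn't keeping up with the requested rate; the author,
      // not the UA, decides whether to back off.
      video.playbackRate = Math.max(1.0, video.playbackRate - 0.25);
    }
  }, 1000);

With something like this, the decision to fall back towards 1.0 stays in
the author's hands, rather than having the UA silently clamp the
playbackRate attribute.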
-- 
Ian Hickson               U+1047E                )\._.,--....,'``.    fL
http://ln.hixie.ch/       U+263A                /,   _.. \   _\  ;`._ ,.
Things that are impossible just take longer.   `._.-(,_..'--(,_..'`-.;.'

Received on Thursday, 17 March 2011 03:20:14 UTC