- From: Sam Ruby <rubys@intertwingly.net>
- Date: Wed, 16 Mar 2011 23:58:14 -0400
- To: Ian Hickson <ian@hixie.ch>
- CC: "HTML WG (public-html@w3.org)" <public-html@w3.org>
On 03/16/2011 11:19 PM, Ian Hickson wrote:
> On Thu, 17 Mar 2011, Frank Olivier wrote:
>>
>>>> . Setting playbackRate larger than 1.0 for live video will not work.
>>> Sure it will, so long as the playhead is far enough back that there is
>>> buffered content to play.
>> Sure, but what happens when you play through the buffered content and
>> get to (effectively) a live feed? The author should be able to determine
>> that effective playback rate is now 1.0 in this outcome - and setting to
>> > 1.0 should not work.
>
> The spec already covers this -- it's the same as what happens if you're
> playing at the rate of 1.0 but you're receiving only one second's worth of
> media every two seconds.
>
> If we're agreed that the browsers _should_ support this, why would we make
> it effectively optional? Shouldn't this be a quality-of-implementation
> issue, where browsers try their best to approximate the requested rate and
> some do a better job than others? It would be equivalent to how some
> browsers can render canvas faster than others, or how some browsers render
> text more beautifully than others. We don't say that browsers that don't
> support rendering text well should simply refuse to add Text nodes to the
> DOM, right? Why would we require that browsers decide whether or not they
> can do a good enough job of going at the requested rate and force them to
> change the playbackRate accordingly?
>
> Would reporting the playback quality (e.g. number of frames rendered over
> the past second of playback as a fraction of the number of frames that
> would ideally have been rendered during that same period) be an acceptable
> alternative solution?

If you have something concrete to propose, please do so in the form of a
Change Proposal by the 18th.

- Sam Ruby
Received on Thursday, 17 March 2011 03:58:50 UTC
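A minimal author-side sketch, in TypeScript, of the situation Olivier describes above: detecting when playback above 1.0 has consumed the buffered content and is effectively pinned at the live edge. The buffered, currentTime, and playbackRate properties are standard HTMLMediaElement API; the function name and the two-second margin are illustrative assumptions, not from the thread or any spec.

    // Sketch only: estimate the rate the user can actually experience.
    // Once playback catches up with the end of the buffered data, the
    // element stalls waiting for media, so a requested rate above 1.0
    // degrades toward 1.0 (or lower) regardless of what was set.
    function effectiveRate(video: HTMLVideoElement, marginSeconds = 2): number {
      const ranges = video.buffered;                      // TimeRanges already downloaded
      if (ranges.length === 0) return video.playbackRate; // nothing buffered yet
      const liveEdge = ranges.end(ranges.length - 1);     // end of last buffered range
      const headroom = liveEdge - video.currentTime;      // seconds of content ahead
      return headroom > marginSeconds
        ? video.playbackRate
        : Math.min(video.playbackRate, 1.0);
    }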
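The quality metric Hickson floats above is, with hindsight, close to what later shipped as getVideoPlaybackQuality() on HTMLVideoElement. A sketch of computing his "fraction of ideally rendered frames" with that API; the nominalFps parameter is an assumption, since the element does not expose the media's frame rate.

    // Sketch of the proposed metric: frames actually presented during the
    // last sampling interval, divided by the frames a nominal frame rate
    // would have produced at the current playbackRate.
    function makeQualitySampler(video: HTMLVideoElement, nominalFps: number): () => number {
      let last = video.getVideoPlaybackQuality();
      return () => {
        const now = video.getVideoPlaybackQuality();
        const elapsed = (now.creationTime - last.creationTime) / 1000; // ms -> s
        const presented = (now.totalVideoFrames - now.droppedVideoFrames)
                        - (last.totalVideoFrames - last.droppedVideoFrames);
        last = now;
        const ideal = nominalFps * elapsed * video.playbackRate;
        return ideal > 0 ? presented / ideal : 1.0;
      };
    }

    // Usage (illustrative): poll once per second, as in Hickson's example.
    // const sample = makeQualitySampler(video, 30);
    // setInterval(() => console.log(`rendered fraction: ${sample()}`), 1000);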