- From: Philip Jägenstedt <philipj@opera.com>
- Date: Wed, 30 Mar 2011 10:29:50 +0200
On Mon, 28 Mar 2011 05:01:51 +0200, Ian Hickson <ian at hixie.ch> wrote:

> It would be helpful if you could describe exactly what is easy and what
> is hard (that is, glitchy or simply unsupported by common media
> frameworks) in terms of media synchronisation, in particular along the
> following axes:
>
> * multiple in-band tracks vs multiple independent files

As long as all tracks are playing synchronously, in-band tracks are easier to deal with, as synchronization happens "for free" as a side effect of being able to play any plain audio+video file. Synchronizing independent files is not something that will be supported in all platform media frameworks, but it is probably possible to fake to sufficient precision.

> * playing tracks synchronised at different offsets
> * playing tracks at different rates

For independent files, this is again not necessarily natively supported by all platform media frameworks, but as long as they can seek and play at arbitrary rates, it's probably possible to fake to sufficient precision. For in-band tracks, one would have to set up several instances of the same decoding pipeline.

> * changing any of the above while media is playing vs when it is stopped

Having in-band tracks change between being in sync (same offset and rate) and being out of sync (different offset or rate) would be a major headache. There is no concept of "stopped" in the current API, only paused, which is exactly like playing (pipeline set up and ready, buffers full) except that time is standing still. In other words, it doesn't make a big difference whether it's paused or playing.

> * adding or removing tracks while media is playing vs when it is stopped

For independent files, this should not be much different from starting playback in the first place -- everything will stall until data for all tracks is available. However, when enabling another in-band track, enough information must be available for the browser to know whether it can reuse an existing pipeline (sharing the demuxer) or whether it has to set up another one. In other words, the playback rate and offset that the author intends to use must be known up front and not be allowed to change later.

> * changing overall playback rate while a synced set of media is playing

This should not be much more difficult than changing the playback rates of individual media resources.

-- 
Philip Jägenstedt
Core Developer
Opera Software
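As a rough illustration of the script-level "faking" of synchronization between independent files mentioned above, the following TypeScript sketch keeps one media element in step with another using only the standard HTMLMediaElement API. The element ids, the offset value, and the drift thresholds are assumptions for the example, not anything taken from the message.

  // Minimal drift-correction sketch: a "slave" element follows a "master",
  // seeking on large drift and nudging playbackRate on small drift.
  const master = document.querySelector<HTMLMediaElement>('#master')!;
  const slave = document.querySelector<HTMLMediaElement>('#slave')!;
  const offset = 0;           // slave plays `offset` seconds behind the master (illustrative)
  const SEEK_THRESHOLD = 0.5; // hard-seek when drift exceeds this many seconds
  const NUDGE = 0.05;         // fractional rate adjustment for small drift

  function correctDrift(): void {
    const target = master.currentTime - offset;
    const drift = slave.currentTime - target;

    if (Math.abs(drift) > SEEK_THRESHOLD) {
      // Large drift: jump straight to the target position.
      slave.currentTime = target;
      slave.playbackRate = master.playbackRate;
    } else {
      // Small drift: run the slave slightly slower or faster until it converges.
      slave.playbackRate =
        master.playbackRate * (drift > 0 ? 1 - NUDGE : 1 + NUDGE);
    }

    if (!master.paused) requestAnimationFrame(correctDrift);
  }

  master.addEventListener('play', () => {
    void slave.play();
    requestAnimationFrame(correctDrift);
  });
  master.addEventListener('pause', () => slave.pause());

How well this works in practice depends on the precision of currentTime reporting and on seek latency in the underlying media framework, which is exactly the "sufficient precision" caveat in the message.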
Received on Wednesday, 30 March 2011 01:29:50 UTC