- From: Silvia Pfeiffer <silviapfeiffer1@gmail.com>
- Date: Sat, 2 Apr 2011 04:21:42 +1100
- To: Mark Watson <watsonm@netflix.com>
- Cc: Eric Carlson <eric.carlson@apple.com>, Ian Hickson <ian@hixie.ch>, "public-html-a11y@w3.org" <public-html-a11y@w3.org>
On Sat, Apr 2, 2011 at 2:13 AM, Mark Watson <watsonm@netflix.com> wrote:
>
> On Mar 31, 2011, at 7:58 PM, Eric Carlson wrote:
>
>> On Mar 31, 2011, at 3:41 PM, Silvia Pfeiffer <silviapfeiffer1@gmail.com> wrote:
>>
>>> On Fri, Apr 1, 2011 at 1:49 AM, Eric Carlson <eric.carlson@apple.com> wrote:
>>>>
>>>> On Mar 30, 2011, at 9:22 PM, Silvia Pfeiffer wrote:
>>>>
>>>>> But I am told UAs render them simply over each other at the top
>>>>> left corner. In any case, there will be a default rendering, IIUC.
>>>>>
>>>> This is not quite right. Every visual track in MPEG-4 and QuickTime
>>>> containers, at least, has a display matrix that determines where it
>>>> is rendered. The movie's display box is the union of all visual
>>>> track display boxes, i.e. the size and position of a visual track
>>>> affects the size of its movie.
>>>>
>>>> It is certainly *possible* to render an in-band track in the top
>>>> left corner, and that may be the default in some media authoring
>>>> software, but it is not a requirement.
>>>
>>> Thanks for the clarification. It confirms, though, that multiple
>>> in-band video tracks are indeed rendered by default into the
>>> existing video viewport unless they are somehow turned off.
>>>
>> Correct.
>>
>>> I now wonder: is there actually a means to turn them off and just
>>> use them in a separate audio or video element with a fragment
>>> identifier?
>>
>> Using an audio track in another element is easy as long as it will be
>> played in sync with the other element(s) that use the same file. If
>> it could possibly be played out of sync, it will be necessary to open
>> up another instance of the file and enable/disable tracks
>> appropriately.
>>
>> Some media engines allow a client to render each track to a separate
>> bitmap/surface, so what I wrote about audio can also be true for
>> video tracks. However, some media engines always composite video
>> tracks, so a client of one of these will always have to open up a
>> separate copy of the file for each video track it wants to render.
>>
>>> In multitrack in-band resources, are the multiple media tracks
>>> typically all activated by default, or how is the decision made
>>> whether to render them?
>>>
>> A track's metadata specifies its initial state (enabled, display
>> matrix, volume, etc.), so the file's author is in control.
>
> All of the above depends on the media file format. Eric started with a
> description of MPEG-4 and QuickTime containers - I wonder if anyone
> knows the case for WebM?

WebM is only defined for one audio and one video track for now, IIUC, so
there is no such thing as auto-enabling further tracks. I'm not sure,
however, what the Matroska container does. The Ogg container certainly
only makes tracks available and leaves all decisions about whether to
display them to the consuming application.

Also, it's not just a matter of the container format - it's also a
matter of the media framework you are using. It is possible that while
QuickTime acts upon the track states in an MPEG-4 file, other frameworks
don't. Is there anyone with more information on this, or do we need to
run some experiments?

> Aside from that, a likely case for multi-track audio and video (IMO)
> is when the @src points to an adaptive streaming manifest containing
> multiple tracks. The default behavior here is undefined, but since the
> tracks are usually labelled in the manifest with some semantic
> information (language, main audio vs commentary, etc.) one could
> expect the player to choose one video and one audio track to start
> with by default.

Yes, manifests are another way of providing multitrack. I think we have
a lot more control over what we do with a manifest, though - in
particular if we create or adapt a specific format. I would think that
auto-enabling of multiple tracks could be avoided.

Silvia.
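[Editorial sketch] The fragment-identifier approach Silvia asks about corresponds to the "track" dimension of the Media Fragments URI syntax. A minimal illustration follows; the track names ("v2", "commentary") are invented for this example, and whether a UA honours the fragment depends on its Media Fragments support:

```html
<!-- Hypothetical sketch: addressing individual in-band tracks of one
     resource via the Media Fragments URI "track" dimension. The track
     names "v2" and "commentary" are made up for illustration; actual
     behaviour depends on the user agent's Media Fragments support. -->
<video src="movie.mp4#track=v2" controls></video>
<audio src="movie.mp4#track=commentary" controls></audio>
```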
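[Editorial sketch] Mark's manifest scenario might look roughly like the following. This is a purely hypothetical manifest fragment, not any standardised format; every element and attribute name here is invented, only to show the kind of semantic labelling (language, main vs. commentary) a player could use to pick one video and one audio track by default:

```xml
<!-- Hypothetical adaptive-streaming manifest fragment (invented
     vocabulary, for illustration only). A player could use the
     lang/role labels to select one video and one audio track by
     default rather than auto-enabling everything. -->
<manifest>
  <track type="video" id="main-video" default="true"/>
  <track type="audio" lang="en" role="main" default="true"/>
  <track type="audio" lang="en" role="commentary"/>
  <track type="audio" lang="fr" role="main"/>
</manifest>
```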
Received on Friday, 1 April 2011 17:22:34 UTC