- From: Silvia Pfeiffer <silviapfeiffer1@gmail.com>
- Date: Mon, 18 Apr 2011 00:05:13 +1000
- To: HTML Accessibility Task Force <public-html-a11y@w3.org>
Hi all,

In the last media subgroup meeting we further discussed the different change proposals that we have for issue-152. A summary of all the submitted change proposals is at http://www.w3.org/WAI/PF/HTML/wiki/Media_Multitrack_Change_Proposals_Summary .

We discussed that Proposal 4 can, with a few changes, satisfy the requirements for in-band and externally composed multitrack resources. Proposal 4 introduces an interface for in-band audio and video tracks, and a Controller object to maintain the shared state between the individual media elements that together make up a composed multitrack resource.

This email serves two purposes. Firstly, it asks others on the accessibility task force whether there are any objections to going with proposal 4 (Philip? Geoff?). The people present at the meeting agreed that they would be prepared to withdraw their change proposals in favor of proposal 4. This includes all proposals numbered 1 to 3 on the summary page. Secondly, it summarizes the remaining issues that we would like addressed for proposal 4.

The remaining issues are:

(1) videoTracks should be MultipleTrackList, too:

The current HTMLMediaElement has the following IDL to expose in-band media tracks:

  readonly attribute MultipleTrackList audioTracks;
  readonly attribute ExclusiveTrackList videoTracks;

The objection is to the use of ExclusiveTrackList for videoTracks. It should be allowed to have multiple in-band video tracks activated at the same time. In particular, MP4 files seem to have a means of specifying how multiple video tracks should be displayed on screen, and Safari is already able to display such compositions. In contrast, proposal 4 requires that only one in-band video track can be active and displayed in the video viewport at a time. If more than one video track is to be displayed, each additional track needs to be referenced with a media fragment URI in a separate video element and connected through a controller.

Some questions here are: what do other browsers want to do with multiple in-band video tracks? Does it make sense to restrict the display to a single video track? Or should it be left to the browser to decide, in which case a MultipleTrackList approach to videoTracks would be sensible? If MultipleTrackList is sensible for both audio and video, maybe it could further be harmonized with TextTrack.

(2) interface on TrackList:

The current interface of TrackList is:

  readonly attribute unsigned long length;
  DOMString getName(in unsigned long index);
  DOMString getLanguage(in unsigned long index);
  attribute Function onchange;

The proposal is that, in analogy to TextTrack, it should also expose a label and a kind in addition to name and language. The label is necessary to include the track in menus for track activation/deactivation. The kind is necessary to classify the track correctly in such menus, e.g. as sign language, audio description, or even a transparent caption track.
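To illustrate the use case, here is a rough JavaScript sketch of how a script could build such a menu. It assumes the proposed getLabel()/getKind() accessors on TrackList, and uses hypothetical enable()/disable()/isEnabled() calls as stand-ins for whatever activation API MultipleTrackList ends up with, so treat it purely as a sketch:

  // Build a menu of in-band audio tracks, using the proposed
  // getLabel()/getKind() alongside the existing getName()/getLanguage().
  function buildAudioTrackMenu(video, menu) {
    var tracks = video.audioTracks;
    for (var i = 0; i < tracks.length; i++) {
      var item = document.createElement('li');
      // Prefer the human-readable label; fall back to name + language.
      var label = tracks.getLabel ? tracks.getLabel(i)
                                  : tracks.getName(i) + ' (' + tracks.getLanguage(i) + ')';
      // kind (e.g. "sign", "descriptions") classifies the entry in the menu.
      var kind = tracks.getKind ? tracks.getKind(i) : '';
      item.textContent = kind ? label + ' [' + kind + ']' : label;
      item.onclick = (function (index) {
        return function () {
          // Hypothetical toggle; the actual activation API is still open.
          if (tracks.isEnabled(index)) tracks.disable(index);
          else tracks.enable(index);
        };
      })(i);
      menu.appendChild(item);
    }
  }

Without label and kind, the only strings available for such a menu are the name and language, which is not enough to tell a sign language track from an audio description track.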
(3) looping should be possible on combined multitrack:

In proposal 4, the loop attribute on individual media elements is disabled on multitrack created through a controller, because it is not clear what looping means for an individual element. However, looping on a multitrack resource with in-band tracks is well defined and goes over the complete resource. In analogy, it makes sense to interpret loop on a combined multitrack resource in the same way. Thus, the controller should also have a loop attribute which is activated when the loop attribute on any slave media element is activated, and the effect should be to loop over the combined resource, i.e. when the duration of the controller is reached, all slave media elements' currentTime values are reset to the initial playback position.

(4) autoplay should be possible on combined multitrack:

Similar to looping, autoplay could also be defined on a combined multitrack resource as the union of the autoplay settings of all the slaves: if one of them is set to autoplay, the whole combined resource is.

(5) more events should be available for combined multitrack:

The following events should be available on the controller:

* onloadedmetadata: raised when all slave media elements have reached at minimum a readyState of HAVE_METADATA
* onloadeddata: raised when all slave media elements have reached at minimum a readyState of HAVE_CURRENT_DATA
* oncanplaythrough: raised when all slave media elements have reached at minimum a readyState of HAVE_ENOUGH_DATA
* onended: raised when all slave media elements are in the ended state

Said differently: these events are raised when the last slave in a group reaches the respective state. They are convenience events that will, for example, help in writing combined transport bars; a sketch after (6) below shows what a script has to do without them. It is easier to attach a single event handler to the controller than to attach one to each individual slave and make sure they have all fired. They also help maintain the logic of when a combined resource is loaded. Since these are very commonly used events, their introduction makes sense. Alternatively, or in addition, a readyState could be added to the controller.

(6) controls on slaves control the combined multitrack:

Proposal 4 does not provide any information on what happens with media elements when the @controls attribute is specified. Do the controls stay in sync with the controls of the other elements? Do they in fact represent the combined state? Or do they represent the state of the slave resource? What happens when the user interacts with them? Is the interaction - in particular seeking, muting, volume changes, play/pause, and rate changes - handed on to the controller so that the other slaves follow?
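For (5), here is a minimal plain-JavaScript sketch of what a combined transport bar has to do today without controller-level events, using only the existing per-element readyState and 'loadedmetadata' event; the element variables in the usage line are just example slaves:

  // Fire a callback once every slave media element in a group has reached
  // at least HAVE_METADATA - i.e. what a single loadedmetadata event on the
  // controller would provide for free.
  function whenAllHaveMetadata(slaves, callback) {
    var remaining = 0;
    slaves.forEach(function (media) {
      if (media.readyState >= HTMLMediaElement.HAVE_METADATA) return;
      remaining++;
      media.addEventListener('loadedmetadata', function handler() {
        media.removeEventListener('loadedmetadata', handler);
        if (--remaining === 0) callback();
      }, false);
    });
    // All slaves already had their metadata when we were called.
    if (remaining === 0) callback();
  }

  // Usage: size a combined transport bar once all durations are known.
  whenAllHaveMetadata([mainVideo, signLanguageVideo, describedAudio], function () {
    // all slaves have metadata; the combined duration is now known
  });

The equivalent bookkeeping is needed again for HAVE_CURRENT_DATA, HAVE_ENOUGH_DATA and ended, which is exactly what the controller events in (5) would fold into single handlers.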
Hopefully we can move forward on all of these issues before the 22nd April deadline for issue-152.

Best Regards,
Silvia.

Received on Sunday, 17 April 2011 14:06:00 UTC