
Re: [media] progress on multitrack api - issue-152

From: Silvia Pfeiffer <silviapfeiffer1@gmail.com>
Date: Tue, 19 Apr 2011 11:28:34 +1000
Message-ID: <BANLkTimFJRjWczRPv34gtrck7zaOpc6cGg@mail.gmail.com>
To: Sean Hayes <Sean.Hayes@microsoft.com>
Cc: HTML Accessibility Task Force <public-html-a11y@w3.org>, "Ian Hickson (ian@hixie.ch)" <ian@hixie.ch>, Paul Cotton <Paul.Cotton@microsoft.com>, "Sam Ruby (rubys@intertwingly.net)" <rubys@intertwingly.net>, "Maciej Stachowiak (mjs@apple.com)" <mjs@apple.com>
Are you referring in particular to a change of the following sentence
in the video element section 4.8.6
(http://www.whatwg.org/specs/web-apps/current-work/multipage/video.html#video):

"A video element is used for playing videos or movies."

Would you prefer this to read something like:

"A video element is used for playing videos, movies, or sound files."

Further a change of:

"The video element is a media element whose media data is ostensibly
video data, possibly with associated audio data."

to something like:

"The video element is a media element whose media data is ostensibly
video data, possibly with associated audio data. This includes
audio-only resources that may be displayed with a representative image
in @poster or with captions or subtitles for visual representation."

Maybe something of this sort can be achieved?
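To make the intent of the amended prose concrete, here is one hypothetical way such an audio-only use of the video element could look in markup (the file names and the WebVTT captions file are invented for illustration):

```html
<!-- Hypothetical audio-only use of <video>: the resource has no video
     track, but a poster image and a captions track give the element a
     visual rendering. -->
<video src="interview.ogg" poster="speaker.jpg" controls>
  <track kind="captions" src="interview.vtt" srclang="en" label="English">
</video>
```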

I would, however, suggest taking this up as a separate bug in the
bugtracker, since it doesn't really have any relation to multitrack -
it's a broader issue with video elements in general.


On Tue, Apr 19, 2011 at 1:32 AM, Sean Hayes <Sean.Hayes@microsoft.com> wrote:
> If the clarifications I proposed last week are included (i.e. to indicate expressly that the <video> element operates on audio-only data, and has a display rectangle to render captions into, even if there is no video data supplied), then I would formally withdraw my CP in favor of Proposal 4 as amended.
> -----Original Message-----
> From: public-html-a11y-request@w3.org [mailto:public-html-a11y-request@w3.org] On Behalf Of Silvia Pfeiffer
> Sent: 18 April 2011 12:49
> To: HTML Accessibility Task Force
> Subject: Re: [media] progress on multitrack api - issue-152
> Note that I have certainly missed issues, so do speak up if you've
> noticed anything.
> Cheers,
> Silvia.
> On Mon, Apr 18, 2011 at 12:05 AM, Silvia Pfeiffer
> <silviapfeiffer1@gmail.com> wrote:
>> Hi all,
>> In the last media subgroup meeting we further discussed the different
>> change proposals that we have for issue-152.
>> A summary of all the submitted change proposals is at
>> http://www.w3.org/WAI/PF/HTML/wiki/Media_Multitrack_Change_Proposals_Summary
>> .
>> We discussed that Proposal 4 can, with a few changes, provide for the
>> requirements of in-band and externally composed multitrack resources.
>> Proposal 4 introduces an interface for in-band audio and video tracks,
>> and a Controller object to maintain the shared state between the
>> individual media elements that together make up a composed multitrack
>> resource.
>> This email serves two purposes:
>> Firstly it asks others on the accessibility task force whether there
>> are any objections to going with proposal 4 (Philip?, Geoff?). The
>> people present at the meeting agreed that they would be prepared to
>> withdraw their change proposals in favor of proposal 4. This includes
>> all proposals numbered 1 to 3 on the summary page.
>> Secondly it summarizes the remaining issues that we would like
>> addressed for proposal 4.
>> The remaining issues are:
>> (1) videoTracks should be MultipleTrackList, too:
>> The current HTMLMediaElement has the following IDL to expose in-band
>> media tracks:
>>  readonly attribute MultipleTrackList audioTracks;
>>  readonly attribute ExclusiveTrackList videoTracks;
>> The objection is to the use of ExclusiveTrackList for videoTracks. It
>> should be possible to have multiple in-band video tracks activated at
>> the same time. In particular, MP4 files have a means of specifying how
>> multiple video tracks should be displayed on screen, and Safari is
>> already able to display them.
>> In contrast, proposal 4 requires that only one in-band video track can
>> be active and displayed into the video viewport at one time. If more
>> than one video track is to be displayed, it needs to be specified with
>> a media fragment URI in a separate video element and connected through
>> a controller.
>> Some questions here are: what do other browsers want to do with
>> multiple in-band video tracks? Does it make sense to restrict the
>> display to a single video track? Or should it be left to the browser
>> what to do - in which case a MultipleTrackList approach to videoTracks
>> would be sensible? If MultipleTrackList is sensible for audio and
>> video, maybe it could further be harmonized with TextTrack.
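The difference between the two track-list behaviours can be sketched in plain JavaScript. These are toy models, not the real WebIDL interfaces - the class names and `enable` method here are illustrative only:

```javascript
// Model of the proposed MultipleTrackList semantics: any number of tracks
// may be enabled at once (as already proposed for audioTracks, and here
// argued for videoTracks too).
class MultipleTrackListModel {
  constructor(n) { this.enabled = new Array(n).fill(false); }
  enable(i) { this.enabled[i] = true; }
}

// Model of the ExclusiveTrackList semantics in proposal 4: enabling one
// track disables all the others.
class ExclusiveTrackListModel {
  constructor(n) { this.enabled = new Array(n).fill(false); }
  enable(i) { this.enabled = this.enabled.map((_, j) => j === i); }
}

const multi = new MultipleTrackListModel(3);
multi.enable(0);
multi.enable(2); // two video tracks are now active together

const excl = new ExclusiveTrackListModel(3);
excl.enable(0);
excl.enable(2); // only track 2 remains active
```

Under MultipleTrackList a browser could composite several in-band video tracks (as Safari reportedly does for MP4); under ExclusiveTrackList the second `enable` call implicitly deactivates the first track.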
>> (2) interface on TrackList:
>> The current interface of TrackList is:
>>  readonly attribute unsigned long length;
>>  DOMString getName(in unsigned long index);
>>  DOMString getLanguage(in unsigned long index);
>>           attribute Function onchange;
>> The proposal is that, in addition to exposing name and language,
>> TrackList should - in analogy to TextTrack - also expose a label and
>> a kind.
>> The label is necessary to include the track into menus for track
>> activation/deactivation.
>> The kind is necessary to classify the track correctly in menus, e.g.
>> as sign language, audio description, or even a transparent caption
>> track.
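A short sketch of why label and kind matter for menus (the shape of the track objects below is assumed for illustration, not taken from any spec text):

```javascript
// Hypothetical tracks carrying the proposed label and kind attributes in
// addition to language.
const tracks = [
  { kind: "main",        label: "Main video",         language: "en"     },
  { kind: "sign",        label: "Auslan interpreter", language: "sgn-AU" },
  { kind: "description", label: "Audio description",  language: "en"     },
];

// With label and kind available, a user agent can build a meaningful
// activation/deactivation menu; with only getName/getLanguage it would be
// limited to index numbers and language codes.
function menuEntries(list) {
  return list.map(t => `${t.label} (${t.kind}, ${t.language})`);
}

console.log(menuEntries(tracks));
```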
>> (3) looping should be possible on combined multitrack:
>> In proposal 4 the loop attribute on individual media elements is
>> disabled for multitrack resources created through a controller,
>> because it is not clear what looping means for an individual element.
>> However, looping on a multitrack resource with in-band tracks is well
>> defined and goes over the complete resource.
>> In analogy, it makes sense to interpret loop on a combined multitrack
>> resource in the same way. Thus, the controller should also have a
>> loop attribute which is activated when a single loop attribute on a
>> slave media element is activated, and the effect should be to loop
>> over the combined resource, i.e. when the duration of the controller
>> is reached, all slave media elements' currentTime-s are reset to
>> initialPlaybackPosition.
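The proposed controller-level looping can be modelled in a few lines. This is a toy simulation under the assumptions above (controller duration as the maximum slave duration, loop derived from the slaves); the class and method names are invented:

```javascript
// Toy model of controller-level looping: when the combined duration is
// reached and any slave has loop set, all slaves seek back to the initial
// playback position.
class ControllerModel {
  constructor(slaves) {
    this.slaves = slaves;
    this.initialPlaybackPosition = 0;
  }
  get duration() { return Math.max(...this.slaves.map(s => s.duration)); }
  get loop() { return this.slaves.some(s => s.loop); }
  // Advance the combined playback position to t; returns true if a loop
  // boundary was crossed and the slaves were reset.
  tick(t) {
    if (t >= this.duration && this.loop) {
      this.slaves.forEach(s => { s.currentTime = this.initialPlaybackPosition; });
      return true;
    }
    this.slaves.forEach(s => { s.currentTime = Math.min(t, s.duration); });
    return false;
  }
}

const c = new ControllerModel([
  { duration: 10, loop: false, currentTime: 0 },
  { duration: 12, loop: true,  currentTime: 0 },
]);
const looped = c.tick(12); // combined duration reached, both slaves reset
```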
>> (4) autoplay should be possible on combined multitrack:
>> Similar to looping, autoplay could also be defined on a combined
>> multitrack resource as the union of all the autoplay settings of all
>> the slaves: if one of them is on autoplay, the whole combined resource
>> is.
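The union semantics for autoplay reduce to a one-liner over the slaves (sketch only; the property shape of the slave objects is assumed):

```javascript
// Combined autoplay as the union of the slaves' autoplay settings: the
// controller autoplays if at least one slave has autoplay set.
const slaves = [
  { autoplay: false },
  { autoplay: true },
  { autoplay: false },
];
const controllerAutoplays = slaves.some(s => s.autoplay);
console.log(controllerAutoplays); // true for this set of slaves
```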
>> (5) more events should be available for combined multitrack:
>> The following events should be available in the controller:
>> * onloadedmetadata: is raised when all slave media elements have
>> reached at minimum a readyState of HAVE_METADATA
>> * onloadeddata: is raised when all slave media elements have reached
>> at minimum a readyState of HAVE_CURRENT_DATA
>> * oncanplaythrough: is raised when all slave media elements have
>> reached at minimum a readyState of HAVE_ENOUGH_DATA
>> * onended: is raised when all slave media elements are in ended state
>> or said differently: these events are raised when the last slave in a
>> group reaches that state.
>> These are convenience events that will for example help write combined
>> transport bars. It is easier to attach just a single event handler to
>> the controller than to attach one to each individual slave and make
>> sure they all fire. Also, they help to maintain the logic of when a
>> combined resource is loaded. Since these are very commonly used
>> events, their introduction makes sense.
>> Alternatively or in addition, readyState could be added to the controller.
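The "last slave in a group reaches that state" rule can be sketched as a small aggregator. The readyState constant mirrors HTMLMediaElement; the helper function and the `_counted` bookkeeping are invented for illustration:

```javascript
// Fire a combined event once every slave has reached a readyState
// threshold; the last slave to cross it triggers the callback.
const HAVE_METADATA = 1;

function fireWhenAllReach(slaves, threshold, fire) {
  let remaining = slaves.length;
  return function onSlaveReadyStateChange(slave) {
    if (slave.readyState >= threshold && !slave._counted) {
      slave._counted = true;
      if (--remaining === 0) fire(); // last slave triggers the event
    }
  };
}

const slaves = [{ readyState: 0 }, { readyState: 0 }];
let fired = false;
const check = fireWhenAllReach(slaves, HAVE_METADATA, () => { fired = true; });

slaves[0].readyState = 1; check(slaves[0]); // not yet: one slave pending
slaves[1].readyState = 1; check(slaves[1]); // now "loadedmetadata" fires
```

A page author attaching one handler like this to the controller avoids tracking per-slave events and counters by hand, which is exactly the convenience argument above.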
>> (6) controls on slaves control the combined multitrack:
>> Proposal 4 does not provide any information on what happens with media
>> elements when the @controls attribute is specified. Do the controls
>> stay in sync with the controls of the other elements? Do they in fact
>> represent combined state? Do they represent the state of the slave
>> resource? What happens when the user interacts with them? Is the
>> information on the interaction - in particular seeking, muting, volume
>> change, play/pause change, rate change - handed on to the controller
>> and do the others follow?
>> Hopefully we can move forward on all of these issues before the 22nd
>> April deadline for issue-152.
>> Best Regards,
>> Silvia.
Received on Tuesday, 19 April 2011 01:29:21 UTC
