
Re: Media--Technical Implications of Our User Requirements

From: Janina Sajka <janina@rednote.net>
Date: Wed, 14 Jul 2010 17:06:08 -0400
To: HTML Accessibility Task Force <public-html-a11y@w3.org>
Message-ID: <20100714210608.GL2452@sonata.rednote.net>
Eric Carlson writes:
> On Jul 13, 2010, at 8:51 PM, Janina Sajka wrote:
> > Colleagues:
> > 
> > The following is a first-pass, high-level description of the content types and API mechanisms our user requirements expose. It is meant to be a list of
> > things we need to support in our eventual change proposal. It is intentionally agnostic
> > regarding any particular technology.
> > 
> >          + 2.5 Content Navigation by Content Structure
> > 
> > A structured data file.
> > 
> > NOTE: Data in this file is used to synchronize all media representations
> > available for a given content publication, i.e. whatever audio, video, and
> > text document--default and alternative--versions may be provided.
> > 
>   To me, "synchronize all media representations" means the data file will be used to control the *playback* of the media file. Is this what you intended, or is the data file supposed to be used to allow *navigation* of the media file? 
Synchronization has to be achieved somehow, of course. However it's
achieved, whatever alternative media representations are selected must
track with the default content.
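
As a rough illustration only (all field and representation names here are hypothetical, not drawn from any spec), such a data file might pair structural navigation points with a time offset per media representation, so that seeking by structure keeps every representation in step:

```typescript
// Hypothetical sketch of a synchronization/navigation data file for one
// publication: each nav point carries a time offset per representation,
// so alternative media track the default content even when they differ
// in length.
interface NavPoint {
  label: string;                    // e.g. "Chapter 2"
  level: number;                    // depth in the structural hierarchy
  offsets: Record<string, number>;  // representation id -> seconds
}

const navPoints: NavPoint[] = [
  { label: "Chapter 1", level: 1, offsets: { video: 0,   audioDesc: 0   } },
  { label: "Chapter 2", level: 1, offsets: { video: 312, audioDesc: 340 } },
];

// Jumping to a nav point seeks every representation to its own offset.
function seekAll(point: NavPoint): Record<string, number> {
  return { ...point.offsets };
}
```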

It also seems to me that the playback control data file should contain
structural navigation points. In fact, Ian seems to be trending in this
direction:

> >          + 3.1 Access to interactive controls / menus
> > 
> > An API providing access to:
> > 
> > Stop/Start
> > Pause
> > Fast Forward and Rewind (time based)
> > time-scale modification control
> > volume (for each available audio track)
> > pan location (for each available audio track)
> > pitch-shift control
> > audio filters

>   You said that this is meant to be a list of "things we need to
>   support in our eventual change proposal"; when did programmatic
>   access to real-time audio filters become a requirement? CA-4 says
>   only "Potentially support pre-emphasis filters, pitch shifting, and
>   other audio processing algorithms". 

Well, we need to define "other audio processing" more closely.
Regardless, I see the list we do have in the reqs doc as covered under
a general concept of audio "filters."

We know this kind of audio tweaking can make a tremendous difference to
persons with significant hearing impairments. So, we need some kind of
(better defined) support in our controls API in order to provide
accessible controls for this. Frankly, such controls aren't of much use
outside real time, because you don't hear the results of your tweaks
immediately.

Again, I don't see this as a requirement on all user agents. I do see it
as a requirement on the API, for the benefit of those user agents that
do implement that kind of control/configuration.
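
To make the shape of that concrete (every name below is hypothetical, a sketch rather than a proposal), the controls API could expose per-track gain and pan alongside a pluggable filter chain, so a user agent that implements audio processing can surface it accessibly while others simply leave the chain empty:

```typescript
// Minimal sketch (all names hypothetical) of per-track audio controls
// with a general filter chain, as discussed above.
type AudioFilter = (sample: number) => number;

class TrackControls {
  volume = 1.0;      // 0..1, per audio track
  pan = 0.0;         // -1 (full left) .. +1 (full right)
  pitchShift = 0.0;  // semitones; applied by the user agent if supported
  private filters: AudioFilter[] = [];

  addFilter(f: AudioFilter): void {
    this.filters.push(f);
  }

  // Apply gain and the filter chain to one sample. Illustrative only;
  // a real implementation would process buffers in real time.
  process(sample: number): number {
    let s = sample * this.volume;
    for (const f of this.filters) s = f(s);
    return s;
  }
}
```

A pre-emphasis filter, for example, would be just another function registered with `addFilter`, rather than a special case in the API.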

> > Viewport content selection, on screen location and sizing control
>   I don't understand what this means 

Our captioning and sign translation requirements include user control of
where on screen, and how much of the screen, is given to captions and/or
signing vs. the default media content.

This is my poor attempt to name that kind of control. Better suggestions
are very welcome.
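
One way to picture it (purely illustrative, with hypothetical names): each piece of content gets a user-adjustable region expressed as fractions of the viewport, so captions or signing can be given more or less screen relative to the default media:

```typescript
// Hypothetical sketch of viewport content selection: the user controls
// where on screen, and how much of it, each content stream occupies.
interface ViewportRegion {
  x: number;      // left edge, 0..1 of viewport width
  y: number;      // top edge, 0..1 of viewport height
  width: number;  // 0..1
  height: number; // 0..1
}

// Example layout: default media on the left, sign translation on the right.
const layout: Record<string, ViewportRegion> = {
  defaultMedia: { x: 0,    y: 0, width: 0.75, height: 1 },
  signing:      { x: 0.75, y: 0, width: 0.25, height: 1 },
};

// Sanity check: a region must stay within the viewport.
function inBounds(r: ViewportRegion): boolean {
  return r.x >= 0 && r.y >= 0 && r.x + r.width <= 1 && r.y + r.height <= 1;
}
```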

> > Extended descriptions and extended captions configuration/control
>   How does one configure descriptions and captions?

I suspect this is a simple binary control. Either pause the default
media to allow the extended alternative content to complete, or truncate
presentation of alternative content in order to keep the default media
playing without interruption.

Perhaps there are other things that should be in this control, but I
can't think of any as yet.
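
The binary control described above can be sketched as follows (names hypothetical): the chosen mode decides whether extended descriptions or captions run to completion or are cut to fit the available gap in the default media:

```typescript
// Sketch of the binary control for extended descriptions/captions:
// either pause the default media until the extended content completes,
// or truncate the extended content to keep the default media playing.
enum ExtendedContentMode {
  PauseDefault,     // pause main media; let extended content finish
  TruncateExtended, // keep main media playing; cut extended content
}

// How long the extended content actually plays, given its full duration
// and the gap available in the default media (both in seconds).
function playableDuration(
  mode: ExtendedContentMode,
  fullDuration: number,
  availableGap: number
): number {
  return mode === ExtendedContentMode.PauseDefault
    ? fullDuration
    : Math.min(fullDuration, availableGap);
}
```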

> > Ancillary content configuration/control
> > 
>   I still don't see a definition of "ancillary" in the requirements document.  

The first starred item under 2.5 should probably add that word in its
second sentence, to read:

	"They also have ancillary structures such as ..."

It's currently hidden in the paragraph that begins:

	"Note that, just as printed books may have  ..."

While this paragraph names TV commercials as "ancillary structures," it
would be good to expand on that so the definition isn't so heavily
weighted toward book publishing.

Possibly: "Outtakes and interviews are examples of ancillary structures
commonly found on movie DVDs today."



Janina Sajka,	Phone:	+1.443.300.2200

Chair, Open Accessibility	janina@a11y.org	
Linux Foundation		http://a11y.org

Chair, Protocols & Formats
Web Accessibility Initiative	http://www.w3.org/wai/pf
World Wide Web Consortium (W3C)
Received on Wednesday, 14 July 2010 21:06:43 UTC
