- From: Al Gilman <Alfred.S.Gilman@IEEE.org>
- Date: Tue, 9 Dec 2008 15:38:54 -0500
- To: WAI XTech <wai-xtech@w3.org>
On 4 Dec 2008, at 7:59 PM, Janina Sajka wrote:

> James Craig writes:
>> I would expect this to be a user setting in the assistive technology,
>> rather than something specified by the author of a web application.

For the user settings to work, the media assets have to come with
metadata to a known schema, so that the selectors in the disposition
rules would catch the right stuff.

This points directly at the "required vocabulary, rules, and orderwire
communication for personalization" topic that Rich is trying to get
going in the Ubiquitous Web Applications WG (formerly known as Device
Independence). The FLUID project is directly involved in developments
in this space:

http://fluidproject.org/

A stab at framing the general problem is available at

http://lists.w3.org/Archives/Public/www-multimodal/2007Jun/att-0002/Accessibility_Notes_on_MMI_Architecture.html

.. which you will find is linked from the HTML Wiki page on multimedia
access at

http://esw.w3.org/topic/HTML/MultimediaAccessibilty

Note the discussion on "packaging and selecting options" involved in
the Timed Text thread:

http://lists.w3.org/Archives/Public/public-tt/2008Dec/thread.html#msg67

.. which is related to the structure sketch Laura forwarded to us:

http://lists.w3.org/Archives/Public/wai-xtech/2008Dec/0061.html

> Actually, AT is itself often guilty of the same oversight. However, I
> would agree content authoring wouldn't be concerned with particular
> devices. It might need to identify content type, e.g. the audio
> description stream in Spanish, the audio description stream in
> English, the main movie audio in Spanish, the main movie audio in
> English, etc. It might then be up to the browser to support directing
> one of these streams to a particular audio device. Does Safari
> support that today? IE? I don't believe I've seen this in Firefox.
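[Ed. note: the "metadata to a known schema" plus "selectors in the disposition rules" idea above can be sketched concretely. The following is a minimal, hypothetical illustration — the field names (`kind`, `lang`) and values are made up for this sketch and do not come from any W3C schema:]

```python
# Sketch of metadata-driven stream selection (hypothetical schema).
# Each media asset carries machine-readable metadata; the user's
# disposition rules act as selectors over that metadata.

STREAMS = [
    {"id": "movie-audio-en", "kind": "main-audio", "lang": "en"},
    {"id": "movie-audio-es", "kind": "main-audio", "lang": "es"},
    {"id": "ad-en", "kind": "audio-description", "lang": "en"},
    {"id": "ad-es", "kind": "audio-description", "lang": "es"},
]

def select_streams(streams, rules):
    """Return the streams matched by the user's disposition rules.

    `rules` is a list of metadata selectors; a stream is selected if
    every key/value pair of some rule matches the stream's metadata.
    """
    selected = []
    for stream in streams:
        for rule in rules:
            if all(stream.get(k) == v for k, v in rule.items()):
                selected.append(stream)
                break
    return selected

# A user who wants Spanish main audio plus English audio description:
prefs = [
    {"kind": "main-audio", "lang": "es"},
    {"kind": "audio-description", "lang": "en"},
]
print([s["id"] for s in select_streams(STREAMS, prefs)])
# -> ['movie-audio-es', 'ad-en']
```

With a shared vocabulary for `kind` and `lang`, the same user settings in the assistive technology would catch the right streams on any site — which is the point of the personalization work referenced above.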
>
> Janina
>
>> On Dec 3, 2008, at 7:31 AM, Janina Sajka wrote:
>>
>>> As we continue development of Web 2.0 functionality, there's at least
>>> one key requirement I believe we've left unaddressed. I do not see
>>> that we've provided for the very real possibility that our user
>>> devices might have more than one audio device. And which audio device
>>> should be used for particular kinds of content is likely to matter a
>>> great deal to the user.
>>>
>>> While I do think this requirement is generalizable to all
>>> input/output modalities, I want to outline a couple of use cases
>>> specifically for audio device differentiation. I must also point out
>>> that it's not just our web specs that don't seem to support directing
>>> audio to one particular device out of several available. OS
>>> specificity in this regard is also less than adequate, in my
>>> experience. Of course, for device specificity in web specs to
>>> succeed, OS support would also need hardening.
>>>
>>> 1.) VoIP
>>>
>>> Users of Skype, SIP and IAX services are very likely to use a headset
>>> device. This will often be a second audio device on the host system,
>>> and not just another input/output option on the default audio device
>>> of that system (especially when that user relies on a screen reader).
>>>
>>> 2.) High End Media Access
>>>
>>> Professional musicians and consumers invested in high quality audio
>>> (and multimedia) experiences will often add higher quality audio
>>> devices to their systems with the intention that certain media types
>>> be directed to those devices.
>>>
>>> * The parent setting up a movie for the family to watch
>>>   will probably not want the screen reader mixed into the
>>>   movie's audio output. Indeed, some might wish audio
>>>   description routed to only certain devices, and not
>>>   others.
>>>
>>> * The musician studying (or creating) a particular
>>>   composition will certainly not want screen reading (or
>>>   system sonicons) mixed into that composition.
>>>
>>> There are other examples, but I expect these will serve to illustrate
>>> my point. We need the ability to direct certain media types to
>>> particular devices. When these exist on user systems, they exist for
>>> a reason, and those reasons must be honored for applications to
>>> succeed.
>>>
>>> --
>>>
>>> Janina Sajka, Phone: +1.202.595.7777;
>>> sip:janina@CapitalAccessibility.Com
>>> Partner, Capital Accessibility LLC    http://CapitalAccessibility.Com
>>>
>>> Marketing the Owasys 22C talking screenless cell phone in the U.S.
>>> and Canada
>>> Learn more at http://ScreenlessPhone.Com
>>>
>>> Chair, Open Accessibility    janina@a11y.org
>>> Linux Foundation    http://a11y.org
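[Ed. note: the routing requirement in the use cases above — VoIP and screen reading to the headset, movie audio to the home-theater output — amounts to a per-content-type routing policy. A minimal sketch, with all device and content-type names hypothetical:]

```python
# Hypothetical sketch: per-content-type audio routing policy.
# Maps a media content type to the user's preferred output device,
# falling back to a default device when no rule applies or the
# preferred device is absent.

ROUTING_RULES = {
    "screen-reader": "usb-headset",
    "voip": "usb-headset",
    "audio-description": "usb-headset",
    "movie-audio": "hdmi-receiver",
}

def route(content_type, available_devices, default="builtin-speakers"):
    """Pick an output device for a content type.

    Honors the user's rule only if that device is actually present on
    the system; otherwise falls back to the default device.
    """
    preferred = ROUTING_RULES.get(content_type)
    if preferred in available_devices:
        return preferred
    return default

devices = {"builtin-speakers", "usb-headset", "hdmi-receiver"}
print(route("movie-audio", devices))    # -> hdmi-receiver
print(route("screen-reader", devices))  # -> usb-headset
print(route("podcast", devices))        # -> builtin-speakers (no rule)
```

Under such a policy the parent's movie audio and the screen reader never share an output, which is exactly the separation the use cases call for; the open problem the thread identifies is standardizing the content-type vocabulary and getting browser and OS support for the actual routing.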
Received on Tuesday, 9 December 2008 20:39:39 UTC