Actions 678 and 679

From: Kim Patch <kim@redstartsystems.com>
Date: Thu, 15 Dec 2011 12:13:16 -0500
Message-ID: <4EEA2AAC.6040204@redstartsystems.com>
To: WAI-UA list <w3c-wai-ua@w3.org>, Wayne Dick <wayneedick@gmail.com>

Write note for top of document about overarching principle of modality 
independence [on Kimberly Patch - due 2011-12-15].

*The Modality Independence Principle*

Users interacting with a web browser may do so using one or more input 
methods, including keyboard, mouse, speech, touch, and gesture. It is 
critical that each user be free to use whatever input method, or 
combination of methods, works best for a given situation. Therefore, 
every potential user interaction must be accessible via 
modality-independent controls that any input technology can access.

For instance, a user who cannot use or does not have access to a mouse, 
but can use a keyboard, can have the keyboard call a 
modality-independent control to activate an OnMouseOver event.
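A minimal sketch of this idea, assuming a plain JavaScript object standing in for a DOM element (the `activateHandler` helper and the `menu` object are hypothetical names, for illustration only, not part of any specification):

```javascript
// Hypothetical helper: activate a named input-device handler on an
// element, regardless of which device the user actually operates.
function activateHandler(el, eventName) {
  const handler = el["on" + eventName];
  if (typeof handler === "function") {
    // A synthetic event object stands in for the real device event.
    handler.call(el, { type: eventName, synthetic: true });
  }
}

// Example: a flyout menu that normally opens on mouse hover.
const menu = {
  open: false,
  onmouseover() { this.open = true; }
};

// A keyboard (or speech) command can invoke the same behavior
// that a mouse hover would trigger.
activateHandler(menu, "mouseover");
// menu.open is now true
```

The point of the sketch is that the control is named by the event it activates, not by the device that triggers it, so any input technology can reach it.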

Combine 261-263 into one SC, with intents, etc. [on Kimberly Patch 
- due 2011-12-15].

        2.6.X Activate any event handlers:

The user can call up a list of input-device event handlers 
explicitly associated with the element that has keyboard focus, 
and activate any one or more of them. (Level A)

Intent of Success Criterion 2.6.X:
Users interacting with a web browser may do so using one or more input 
technologies, including keyboard, mouse, speech, touch, and gesture. No 
matter how the user is controlling the user agent, the user needs to 
know all the input methods assigned to a particular piece of content. 
At the same time, any one input method should not arbitrarily hold back 
another. For instance, people who don't use a mouse shouldn't 
necessarily have to map their input methods to the same steps a mouse 
user would take.

Examples of Success Criterion 2.6.X:

          o Jeremy cannot use a mouse. He needs to activate a flyout
            menu that normally appears OnMouseOver. Jeremy can navigate
            to a link on this flyout menu and activate it using the
            keyboard.

          o Ken is a speech input user. To get his work done in a
            reasonable amount of time, and without overtaxing his
            voice, he uses a single speech command phrase to move the
            mouse up, left, and click.
          o Karen cannot use a mouse. She presses a single key to
            activate both events for a link that has both an
            onmousedown and an onmouseup handler.
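One way a user agent could support this success criterion can be sketched as follows. This is a simplified model, assuming an element is represented as a plain object; `listInputHandlers` and `INPUT_EVENTS` are hypothetical names, not drawn from any specification:

```javascript
// Input-device events a user agent might inspect for explicit handlers.
const INPUT_EVENTS = ["click", "mouseover", "mouseout",
                      "mousedown", "mouseup", "keydown", "keyup"];

// Hypothetical sketch: list the input-device event handlers explicitly
// associated with the element that has keyboard focus.
function listInputHandlers(el) {
  return INPUT_EVENTS.filter(name => typeof el["on" + name] === "function");
}

// Example: Karen's link with both onmousedown and onmouseup handlers.
const link = {
  pressed: false,
  activated: false,
  onmousedown() { this.pressed = true; },
  onmouseup()   { this.activated = true; }
};

// The user agent presents the list; the user activates one or more
// with a single action, whatever their input method.
listInputHandlers(link).forEach(name =>
  link["on" + name].call(link, { type: name, synthetic: true }));
// link.pressed and link.activated are now both true
```

Enumerating the handlers first is what lets the user discover every input method assigned to the content, rather than guessing which device events the author wired up.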

Related Resources for Success Criterion 2.6.X:

The Modality Independence Principle X.X


Kimberly Patch
Redstart Systems, Inc.
(617) 325-3966

www.redstartsystems.com
- making speech fly

Blog: Patch on Speech
+Kim Patch
Twitter: RedstartSystems
Received on Thursday, 15 December 2011 17:12:59 UTC
