Re: ACTION-656: reconciling 2.3.2, 2.3.x, and 2.3.4

I think this is ok.

I had some thought that 2.3.x was meant to generalize 2.3.2 and 2.3.4
(and make the SC a bit more technical: programmatic and perceivable
labels), but I can find no reference or discussion of that. It seems
to have happened during TPAC. 2.3.x appears in the 4 November draft
but is not in the 20 October draft.

Ah ha, it was 2.5.1 and was moved to 2.3.x, but there is no record of
the discussion. I guess it seemed to fit under GL 2.3.


On Thu, Mar 22, 2012 at 2:29 PM, Greg Lowney
<gcl-0039@access-research.org> wrote:
> Re ACTION-656:  "[Greg] And Kim to reconcile duplications of 2.3.2, 2.3.x
> and 2.3.4 all about presenting direct commands in content"...
>
> Kim and I looked over these three SC and decided the clear redundancy is
> best fixed by deleting 2.3.x. That leaves 2.3.2 and 2.3.4 pretty good by
> themselves. Two minor things remain with them:
>
> 1. 2.3.2 and 2.3.4 are entirely and appropriately parallel, except that
> 2.3.4 ends with a lengthy parenthetical example. I'd delete that example, as
> it's largely redundant with the Examples in the Implementing document, and
> I don't feel it's necessary for understanding the SC.
>
> 2. Both Intent paragraphs are pretty weak. In fact, an editorial pass could
> probably combine the best bits from all four Intent paragraphs dealing with
> direct navigation, and replicate them into each of the SC.
>
>
> For reference, here are the two we recommend keeping:
>
> 2.3.2 Present Direct Commands in Rendered Content (former 2.1.6):
>
> The user can have any recognized direct commands in rendered content (e.g.
> accesskey, landmark) be presented with their associated elements. (Level A)
>
>
> Intent of Success Criterion 2.3.2:
> Make it easy for users to discover or be reminded of keyboard shortcuts
> and similar commands without leaving the context in which they're working.
> Easy keyboard access is especially important for people who cannot easily
> use a mouse; mouseless browsing is one example. Some users have difficulty
> controlling the mouse and/or the keyboard and therefore find control by
> speech recognition advantageous. In that case it is much more efficient
> when navigation and activation points are both visible to the user and
> controllable by their assistive technology.
> Examples of Success Criterion 2.3.2:
>
> Fiona uses an audio browser. When the system reads form controls in the
> rendered content, it reads the label of the form control followed by its
> accesskey (e.g., "name alt plus n").
> Mary cannot use the mouse or keyboard due to a repetitive strain injury;
> instead she uses voice control technology with a mouseless-browsing plug-in
> for her browser. The plug-in overlays each hyperlink in rendered content with
> a number that can then be used to select it directly by speaking a command
> (e.g. "select link 12"). This saves Mary from having to say the word
> 'tab' numerous times to reach her desired hyperlink.
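> A minimal sketch of that overlay technique (illustrative TypeScript for a
> content script such a plug-in might inject; not the extension's actual code):
>
>     // Number every hyperlink in the rendered content so that a spoken
>     // command such as "select link 12" can activate it directly.
>     const links = Array.from(
>       document.querySelectorAll<HTMLAnchorElement>('a[href]'));
>     links.forEach((link, i) => {
>       const badge = document.createElement('span');
>       badge.textContent = ` [${i + 1}]`;  // perceivable label next to the link
>       link.appendChild(badge);
>     });
>
>     // The speech front end would then map "select link N" onto:
>     function selectLink(n: number): void {
>       const link = links[n - 1];
>       if (link) {
>         link.focus();
>         link.click();
>       }
>     }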
>
> Related Resources for Success Criterion 2.3.2:
>
> See 2.1.7 for User Interface commands
> Mouseless Browsing Firefox Extension:
> https://addons.mozilla.org/en-us/firefox/addon/mouseless-browsing/
> Perceivable navigation and activation keys:
> http://www.mouseless.de/index.php?/content/view/17/30/
> Microsoft placing Wikipedia on TV-DVD and using mouseless browsing via
> remote control: http://research.microsoft.com/research/tem
>
>
> 2.3.4 Present Direct Commands in User Interface (former 2.1.7):
>
> The user can have any direct commands (e.g. keyboard shortcuts) in the user
> agent user interface be presented with their associated user interface
> controls (e.g. "Ctrl+S" displayed on the "Save" menu item and toolbar
> button). (Level AA)
>
>
> Intent of Success Criterion 2.3.4:
> For many users, including those who use the keyboard or an input method
> such as speech, the keyboard is often a primary method of user agent
> control. It is important that direct keyboard commands assigned to user
> agent functionality be discoverable as the user is exploring the user agent.
> Examples of Success Criterion 2.3.4:
>
> Vlad is a keyboard-only user who uses a browser on Mac OS. When he
> needs to perform a new operation with the browser user
> interface, he searches for it in the menus and notes whether the menu item
> has a " ⌘ " label (e.g. "Copy ⌘-C"), which indicates the direct activation
> command he can use in the future to avoid having to traverse the menus.
> Amir uses ability switches to control an onscreen keyboard on the Windows
> operating system. When he presses the "Alt" key, the available browser user
> interface accesskeys are shown as overlays on the appropriate user interface
> controls (e.g. "File with 'F' in an overlay").
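> A minimal sketch of that pattern for a web-based user agent UI (illustrative
> TypeScript; the menu item is hypothetical and support for the ARIA
> aria-keyshortcuts attribute is assumed):
>
>     // Present the direct command together with its control: show the
>     // shortcut in the visible label and also expose it programmatically
>     // so assistive technology can announce it.
>     const saveItem = document.createElement('button');
>     saveItem.setAttribute('role', 'menuitem');
>     saveItem.setAttribute('aria-keyshortcuts', 'Control+S'); // programmatic
>     saveItem.textContent = 'Save';
>     const hint = document.createElement('span');
>     hint.textContent = '  Ctrl+S';                            // perceivable label
>     saveItem.appendChild(hint);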
>
> Related Resources for Success Criterion 2.3.4:
>
> To be written
>
>
>
> And here is the one we recommend deleting:
>
> 2.3.x Discover navigation and activation keystrokes (former 2.5.1):
>
> The user can discover direct navigation and activation keystrokes both
> programmatically and via perceivable labels. (Level A)
>
>
> Intent of Success Criterion 2.3.x:
> Examples of Success Criterion 2.3.x:
>
> To be written
>
> Related Resources for Success Criterion 2.3.x:
>
> 2.3.1 and 2.3.2
>
>
>
>     Thanks,
>     Greg and Kim



-- 
Jim Allan, Accessibility Coordinator & Webmaster
Texas School for the Blind and Visually Impaired
1100 W. 45th St., Austin, Texas 78756
voice 512.206.9315    fax: 512.206.9264  http://www.tsbvi.edu/
"We shape our tools and thereafter our tools shape us." McLuhan, 1964

Received on Thursday, 29 March 2012 17:04:05 UTC