W3C home > Mailing lists > Public > w3c-wai-ua@w3.org > July to September 1999

Re: checkpoint 6.6

From: Jon Gunderson <jongund@staff.uiuc.edu>
Date: Wed, 18 Aug 1999 08:52:36 -0700
Message-Id: <199908181347.IAA18218@staff1.cso.uiuc.edu>
To: thatch@us.ibm.com, w3c-wai-ua@w3.org
Some types of assistive technology, like TextHelp, ZoomText, and Aurora,
use the selection to identify information to speak.  This is especially
important for browsers that do not have a cursor for the AT to track.  A user
could select a section of text with the mouse (or, hopefully, directly with
the keyboard someday) and use one of these assistive technologies to read
the selected information.  The speech output features of TextHelp and Aurora
are designed for people with learning disabilities, while ZoomText is
designed for people with visual impairments.
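
The mechanism described above can be sketched in a few lines. This is an
illustrative Java Swing example, not how TextHelp, ZoomText, or Aurora are
actually implemented: the speak() method is a hypothetical stand-in for a
speech engine, while JTextArea.getSelectedText() is the real Swing accessor
such a tool could poll.

```java
import javax.swing.JTextArea;

public class SelectionReader {
    // Hypothetical stand-in for an assistive technology's speech engine.
    static void speak(String text) {
        System.out.println("SPEAK: " + text);
    }

    public static void main(String[] args) {
        JTextArea area = new JTextArea("Some document text the user is reading.");
        // Simulate the user selecting the word "document" (characters 5-12).
        area.select(5, 13);

        String selected = area.getSelectedText();
        if (selected != null) {
            speak(selected); // prints "SPEAK: document"
        }
    }
}
```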

This is why the checkpoint was given a high priority.


At 05:20 PM 8/17/99 -0500, thatch@us.ibm.com wrote:
>Here is 6.6:
>Allow the user to control selection highlighting (e.g., foreground
>and background color).
>Sure that is a nifty idea. It is also another setting, which is complexity.
>But, since selection is usually reverse video, my question is, has that
>ever been done before? Has this checkpoint been given adequate consideration?
>If it has never been done before, if no one has seen selectable
>selection colors, then I think this is an example of a priority one
>checkpoint that belongs in the priority three bucket, at best.
>Jim Thatcher
>IBM Special Needs Systems
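
For what it is worth, at least one widely used toolkit of the time already
exposes exactly this pair of settings: Java Swing's JTextComponent provides
setSelectionColor() and setSelectedTextColor(). A minimal sketch follows; the
specific color values are placeholders, since a real user agent would read
them from the user's preferences.

```java
import java.awt.Color;
import javax.swing.JTextArea;

public class SelectionColors {
    public static void main(String[] args) {
        JTextArea area = new JTextArea("Example text");

        // Placeholder values; a user agent would take these from the
        // user's preference settings rather than hard-coding them.
        area.setSelectionColor(Color.YELLOW);     // selection background
        area.setSelectedTextColor(Color.BLACK);   // selection foreground
    }
}
```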

Jon Gunderson, Ph.D., ATP
Coordinator of Assistive Communication and Information Technology
Division of Rehabilitation - Education Services
University of Illinois at Urbana/Champaign
1207 S. Oak Street
Champaign, IL 61820

Voice: 217-244-5870
Fax: 217-333-0248
E-mail: jongund@uiuc.edu
WWW:	http://www.staff.uiuc.edu/~jongund
Received on Wednesday, 18 August 1999 09:51:09 UTC
