Re: Scott Isensee: ANSI standard

People have reported problems with both the attachment itself and its
format.

Sorry for this.

The original as received from Scott is at 
  http://www.w3.org/WAI/group/ansiaces.ps

I'm attaching inline the same document, converted through ps2ascii
(paragraphs are not wrapped into lines, though).


PS: I just got back from a short vacation and have to finish writing a
TIDE proposal for WAI funding this week, so don't expect much activity
from me in the next few days.



==================================

4 June 1997 DRAFT: HFES/ANSI 200, Section 5 - editor: eric.bergman@sun.com

Section 5: Accessibility

Attention Reviewers -- In addition to any comments you may add to the margins and text, please mark each guideline with a quick relevance rating:

1 = Very Important (Must be in the standard)
2 = Important (Probably should be in the standard)
3 = Neutral (May or may not belong in the standard)
4 = Unimportant (Probably should not be in the standard)
5 = Very Unimportant (Definitely does not belong in the standard)

Regarding references -- if you can provide additional references supporting a guideline from expert practice, from published guidelines (better), or research (best), please note those reference(s) in the margin. References are preceded by a code letter indicating

D (editor recommendation). All D level references must be reviewed for upgrade to at least C level in order to be used.

and this standard in general: human-computer interaction. The entire standard is intended to apply across different systems and technologies, but primarily focusing on "office computing" software. Hardware is not covered. Also, details of implementation or operating systems are

keyboard".   The method may vary across systems, allowing system developers/ manufacturers to decide on implementation.

Lastly, although most editors who view this document may experience chest pains, be aware that most of the uninspired format and structure of this document are dictated by committee and ANSI styles.   If there are style issues you have concerns about, please note them, and I will bring them to the attention of the folks in charge of document style for the committee (most of the ANSI conventions are beyond our control).

as are recommendations on content to be added or removed.


5.1 Definitions

For the purposes of this standard, the following definitions apply:

5.1.1 accelerator keys: Keys or key combinations, not normally used for data entry, which invoke an action immediately without displaying intermediate information (such as menus) or requiring pointer movement or any other user activity. Also called "shortcut keys" and "hot keys." (9241/14.3.1; 9241/15.3.11)

5.1.2 accessibility: The set of properties that allows a product, service, or

facility to be used by people with a wide range of capabilities. Although "accessibility" typically addresses users who have a disability, the concept is not limited to disability issues.

5.1.3 activation: The initiation of an action associated with a selected object.

5.1.4 assistive technologies: Hardware or software products used by people with disabilities to accomplish their tasks. Examples include Braille displays, screen readers, screen magnification software, and eye tracking devices.

5.1.5 blind user: Anyone who uses a non-visual interface (auditory or tactile)

as the only means of interaction with a computer that is designed for primarily visual interaction.

5.1.6 BounceKeys: A feature, often implemented in the system software, that

allows users to set a delay between one keystroke and the software's acceptance of the next keystroke.

5.1.7 chorded press: A keyboard or pointing device action in which more than one button is held down simultaneously.

5.1.8 contrast: The difference between the color, luminosity, reflectance, or

shading of an image and the background of the image.

5.1.9 cursor: The visual indication of where the user interaction via keyboard

will occur. See also "focus cursor" and "text cursor." Contrast with "pointer."

5.1.10 deaf user: Anyone who cannot perceive auditory output even with the use of a hearing aid and does not hear sound below 90 decibels.

5.1.11 direct accessibility: Accessibility that is provided without the use of

assistive hardware or software that has been added to an "off the shelf" design. (B:VAN92a).

5.1.12 disability: An impairment that interferes with the customary manner in which a person performs a task. Note that the legal definitions of a disability vary from country to country, and may differ from the definition stated here.

5.1.13 explicit focus: A condition in which windows, objects, and controls receive input focus when the pointer is over them and the pointer button is pressed, i.e., in which explicit focus must be assigned. See also implicit focus.

5.1.14 focus cursor: A graphic indicator showing what object has keyboard input focus. See also Input Focus, Focus Indicator, and Text Cursor.

5.1.15 focus indicator: An indicator that shows which window or pane has input focus; for example, the window with input focus displays solid title bars, etc., and all other windows show only the outlines. See also Input Focus, Text Cursor, and Focus Cursor.

5.1.16 hearing-impaired user: Anyone who has difficulty perceiving auditory information even with the use of a hearing aid.

5.1.17 impairment: Any deficit in psychological, physiological, or anatomical structure or function. An impairment is not a disability if it does not interfere with task performance. See also "disability."

5.1.18 implicit focus: A condition in which windows, objects, and controls

receive input focus when the pointer simply passes over them, rather than when a pointer button is pressed. Keyboard navigation provides an implicit focus policy by giving focus to whatever object currently is indicated by the focus cursor. Contrast with explicit focus.

5.1.19 input focus: The current assignment of the input from the keyboard or equivalent to a user interface object (a window or an object within a window). For an individual object, focus is indicated by a focus cursor. Two kinds of input focus are pointer focus and keyboard focus. See "focus cursor."

5.1.20 keyboard: A hardware device (or logical equivalent) consisting of a number of keys. A logical equivalent may have a visual representation of keys (e.g., an on-screen keyboard) or it may not (e.g., voice recognition).

5.1.21 keyboard equivalents: Keys or key combinations that provide keyboard access to functions usually activated by a pointing device.

5.1.22 latch: If pressed while in latch mode, a modifier key such as Shift, Alt, or

Ctrl remains logically pressed in combination with a single subsequent keypress.

5.1.23 lock: If pressed while in lock mode, a modifier key such as Shift, Alt, or Ctrl remains logically pressed in combination with any number of subsequent keypresses until lock mode is turned off.

5.1.24 low-vision user: Anyone who reads a standard screen display at significantly higher than default character size even with use of corrective lenses.


5.1.25 mnemonic: A letter on a menu or control indicating the corresponding

key that, when pressed, activates that choice.

5.1.26 MouseKeys: A system software feature providing keyboard control of

pointer movement and mouse button functions.

5.1.27 physically impaired user: Anyone whose motor function significantly

affects use of a standard keyboard or pointing device.

5.1.28 pointer: A user interface object represented by a graphical symbol that moves on the screen to reflect the user's manipulation of a pointing device and/or the current state of the dialog or system. Users interact with objects by moving the pointer to an object's location and starting a manipulation of that object. Although the pointer is sometimes called a "pointing cursor," this document uses the word "cursor" only for a keyboard focus indicator. (9241/16.3.14)

5.1.29 RepeatKeys: A system software feature that enables users to set the delay prior to the onset of key repeat, allowing users who have limited coordination to release keys.

5.1.30 screen reader: Assistive software that allows blind users to navigate

through windows, determine the state of controls, and read text through braille or text-to-speech conversion.

5.1.31 ShowSounds: A system software feature that provides a software flag

indicating that all sounds should be accompanied by a visual alternative (e.g., flashes for beeps, closed captions for speech).

5.1.32 SlowKeys: A system software feature that enables users to modify the

delay prior to keypress acceptance.   This prevents users who have limited coordination from accidentally pressing keys.

5.1.33 StickyKeys: A system software feature that provides latching and locking of modifier keys for users who find it difficult or impossible to press and/or hold multiple keys simultaneously.

5.1.34 text cursor: The visual indication of the current insertion point for text entry, i.e., the character position where text will be inserted into a text entry area (e.g., in a word processor). Contrast with "pointer" and "focus cursor." See also Input Focus, Focus Indicator, and Focus Cursor.

5.1.35 ToggleKeys: A system software feature that enables users to have the

system produce a sound when the current state of a binary state keyboard toggle control such as "Caps Lock" or "Num Lock" changes.

5.2 Introduction

This section of the standard addresses the usability of software for users who have disabilities. According to the World Health Organization, over 10 percent of the world's population has a disability (B:THO93). In addition, many countries now have legal requirements for accessibility of information technologies (B:THO93, B:BER95, B:MAN91, B:PER91, B:MCC94).

(B:BER95, B:NEW93, B:EDW93). In the case of human-computer interaction, providing accessibility means taking into account varying physical and sensory capabilities across user populations. A user who has hearing impairments, for example, may find software easier to use because it provides subtitles for audio information. A user who is blind may find a web

In general, the guidelines in this section of the standard focus on those issues that disproportionately impact people with disabilities.

As the discussion above implies, the boundary between guidelines "for disability" and general guidelines in ANSI 200 is necessarily fuzzy. For these reasons, a guideline of particular relevance to users who have disabilities may be listed here in addition to its location in other sections.

5.3 Scope

Accessibility may be provided by built-in software and hardware or add-on assistive software and hardware. These guidelines are aimed at reducing the need for add-on assistive technologies, and at supporting those technologies when they are required.

Assistive technologies typically provide specialized input and output capabilities not provided by the system. Software examples include on-screen keyboards that replace physical keyboards and screen readers that allow blind users to navigate through applications, determine the state of controls, and read text via text to speech conversion. Hardware examples include head-mounted pointers that replace mice and braille output devices that replace a video display.

the extent that systems and applications integrate with those technologies. For this to allow software to operate effectively with add-on assistive software and hardware as recommended in these guidelines. If systems do not provide support for assistive technologies, the probability increases that users will encounter problems with

use system provided mechanisms (such as customization for color, font, and audio, or


blocked.

5.4 User characteristics

The user characteristics of people with any given disability vary significantly, just as they do in the general population. The issues described below are typical of those encountered by individuals with various disabilities, and do not constitute a comprehensive list. People may concurrently experience more than one of the disabilities outlined below. The needs of people who have such combinations of disabilities are covered in most cases by the overlap across guidelines.

5.4.1 Issues commonly encountered by users who are blind

presentation. Many users who are blind interact with computers through "screen readers" -- assistive software that can provide a spoken or braille description of windows, controls, menus, images, text, and other information typically displayed visually on a screen.

screen readers. To the extent that interactions depend on understanding a spatial presentation, users who are blind are more likely to encounter difficulties. In addition, because many users who are blind are reading screens by means of synthesized speech output, they may find it difficult or impossible to attend to auditory outputs that occur while they are reading.

5.4.2 Issues commonly encountered by users who have low-vision

The issues commonly faced by users who have low-vision include color perception deficits, impaired contrast sensitivity, and loss of depth perception.

People who have low-vision use different means of increasing the size, contrast, and visibility of displayed information, including hardware or software magnification to enlarge portions of the display. When interacting with computers, these users may not pick up size-coded information, may have difficulty with font discrimination, and may encounter difficulties locating or tracking interface objects such as pointers, cursors, drop targets, hot spots, and direct manipulation handles.


5.4.3 Issues commonly encountered by users who have hearing impairments

The issues commonly faced by users who have hearing impairments but retain some functional hearing include inability to discriminate frequency changes, decreased frequency range and dropout, difficulties localizing sounds, and difficulty picking up sounds against background noise.

Users who have hearing impairments may or may not use electronic hearing aids. If it is available on their operating system, they may use the "ShowSounds" feature, which notifies software to present audio information in visual form.

frequencies, or of low volume. Customization is key to providing them with access.

5.4.4 Issues commonly encountered by users who are deaf

In addition to a general inability to pick up auditory information, the issues commonly faced by users who are deaf include difficulty using voice input systems, and experience with English as a second language (sign language may have been learned first, at an early age). If it is available, users who are deaf will typically use the "ShowSounds" feature that notifies software to present audio information in visual form.

or where sound is turned off or cannot be used (e.g., a library).

5.4.5 Issues commonly encountered by users who have physical impairments

The issues commonly faced by users who have physical impairments often follow from physical limitations including poor coordination, weakness, difficulty reaching, and inability to move a limb.

Users with physical impairments may or may not use assistive technologies, and the variety of hardware and software they employ is too large to describe in detail in this space. A few examples, however, include eyetracking devices, on-screen keyboards, speech recognition, and alternative pointing devices.


The wide variation in needs and capabilities among this user population means that customization of input parameters and timing is extremely important for effective access.

5.5 General guidelines

While it may not be possible to make all systems accessible without add-on assistive technologies, following these guidelines will reduce the number of people who will require assistive technologies, and increase usability of systems when assistive technologies are used.

For example, system software that provides built-in screen magnification reduces the need for add-on magnification software. Systems can also promote integration of assistive technologies by providing information that can be read by assistive technologies, and by communicating through standard application-to-application communication protocols (B:THO93).

5.5.1 Use system standard input/output

To enable people to use assistive technologies effectively, software should use system-provided input and output methods wherever possible, and if system routines must be bypassed, set or obtain system state information using system variables or system routines (B:VAN92a, B:BER95).

Example: Software moves a text cursor using system software routines. This allows assistive software to read current cursor position.

Example: Software uses custom routines to draw graphics for better performance. The software provides an option that detects the state of an "assistive technology flag". When the flag is set, the software uses the system routines for graphics.

Example: Software draws text using its own routines for performance reasons, but copies that text to the system text buffer so that text is readable by assistive technology.
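The following minimal Python sketch illustrates the pattern in the examples above. It is illustrative only; the function names and the assistive technology flag are assumptions, not part of this standard or of any particular operating system.

    # Sketch: fall back to system text routines when an assistive
    # technology flag is set, so assistive software can read the text.
    # All function and flag names here are hypothetical.

    ASSISTIVE_TECHNOLOGY_FLAG = True   # would normally be queried from the system

    def system_draw_text(text):
        # Stand-in for a system-provided text routine that assistive
        # software can monitor.
        print("[system routine] " + text)

    def custom_draw_text(text):
        # Stand-in for faster custom rendering that bypasses the system.
        print("[custom routine] " + text)

    def system_copy_to_text_buffer(text):
        # Stand-in for copying drawn text to a system text buffer.
        pass

    def draw_text(text):
        if ASSISTIVE_TECHNOLOGY_FLAG:
            system_draw_text(text)            # readable by assistive software
        else:
            custom_draw_text(text)            # custom rendering...
            system_copy_to_text_buffer(text)  # ...but text still exposed

    draw_text("Save changes before closing?")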

5.5.2 Enable user choice

Software should enable as many input and output alternatives as possible (B:VAN92a, B:THO93).


5.5.3 Use of input/output alternatives

Software should enable users to choose among input/output alternatives without requiring them to reconfigure or restart the system.

Note: This transparency aids non-impaired users and impaired users working together on the same system.

Example: To perform the same action, one user uses the mouse and types in text, another uses a key sequence, and a third user chooses a menu item.

5.5.4 User preferences

Users should be able to customize input and output characteristics without having to restart the system (B:IBM88).

Example: System software allows users to configure color, volume, and pointer control settings that apply everywhere on the system.

Example: A software application allows users to configure and save settings for font size and style within a particular window.

5.5.5 User setting of timed responses

Where software requires a user response within a limited time in order for that response to be valid, the time range should be adjustable (B:MS96).

5.5.6 Provide object labels

Software should provide object labels meaningful to users, stored as accessible text, whether those labels are visually presented or not (B:VAN92a, B:BER95).

Note: Labels name an object. Examples include icon names, window titles, and button labels. For many simple objects such as a "pencil" icon, the name may be sufficient; more complex objects may require both labels and descriptions.


Example: When the pointer moves over a control that does not have a visible label, a popup label appears.

Example: Where a label is not shown, non-visible labels such as icon variable names are assigned a meaningful name ("eraser") that may be recognized and read to the user by assistive software when the object has focus or the pointer moves over it.

Note: This ensures that a user who is hearing rather than seeing the screen gets appropriate contextual information.

Example: Compound objects that consist of a collection of other objects have a group label. A web page image composed of a series of smaller image files, for example, has a group label as well as an individual image label: "full view of a building", "bulldozer", "dump truck", "crane" and so on.

5.5.7 Provide object descriptions

Where tasks require access to visual content of objects beyond what a label provides, software should provide object descriptions stored as accessible text, whether those descriptions are visually presented or not (B:VAN92a).

Note: Users who have low-vision or are blind may use software that can present descriptions through speech or braille. A label names an object, but the description gives details and context, providing access to users who are blind, who have low-vision, or are accessing information without a display (e.g., over a phone or in a car).

Example: An image labeled "Construction Site" might have the following description: "The corner of two busy city streets. A crane is hoisting an enormous steel beam over a chain link fence."

Example: A graphic is described as "a map of Western Europe, with a jagged line across Spain indicating where the glacial advance stopped in the last ice age".

Example: A screen reader reads alternative tag names in HTML that describe graphic image content.

Example: A graphic encyclopedia's animation provides a stored textual description: "A lava flow pours from the volcano, covering the town below within seconds."
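The following minimal Python sketch shows one way labels (5.5.6) and descriptions (5.5.7) could be stored as accessible text alongside an object. The class and field names are illustrative assumptions, not an API defined by this standard.

    # Sketch: every user interface object carries a short accessible
    # label and an optional longer description that assistive software
    # can query, whether or not they are drawn on screen.
    class AccessibleObject:
        def __init__(self, label, description=None):
            self.label = label              # names the object ("eraser")
            self.description = description  # details and context

    icon = AccessibleObject("eraser")
    image = AccessibleObject(
        "Construction Site",
        "The corner of two busy city streets. A crane is hoisting "
        "an enormous steel beam over a chain link fence.")

    def speak(obj):
        # Stand-in for a screen reader reading the stored text.
        print(obj.label)
        if obj.description:
            print(obj.description)

    speak(image)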


5.5.8 Notify about new information

Information indicating when new text or graphic information is presented should be available to users and to assistive technologies.

on an offscreen task).

Example: Software allows the user to request a beep when new information is displayed.

Example: Assistive software such as a screen reader or magnifier can read an event variable indicating that a new message has been posted in a particular window.

5.5.9  Notification of object events

Information on object events should be available to assistive technologies. Such events include, but are not limited to, changes in state such as selection and position and changes in attributes such as size and color.

Example: When a user selects an item in a list box, assistive software is able to determine that an event has occurred in the list box.

Example: When a user moves an icon, assistive software is able to determine that the icon has changed position.

Example: When a user causes a push-button to gain focus, assistive software is able to determine that focus has changed to that button.

5.5.10 Information on object attributes

Information on individual object attributes should be available to assistive technologies. Such attributes include, but are not limited to, object size, position, and current state.

Example: Assistive software can get the foreground and background colors of a button.

Example: Assistive software can determine the boundaries, font, and color of an area of text on the screen.

Example: Assistive software can determine the word, sentence, and paragraph boundaries of an area of text on the screen.
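A minimal Python sketch of event notification and attribute queries of the kind described in 5.5.8 through 5.5.10. The class, event, and attribute names are illustrative assumptions, not a defined API.

    # Sketch: an object reports state changes to any registered listener
    # (e.g., a screen reader) and exposes its attributes on request.
    class ListBox:
        def __init__(self):
            self.listeners = []
            self.selected = None
            self.size = (200, 100)
            self.position = (10, 10)

        def add_listener(self, callback):
            self.listeners.append(callback)

        def select(self, item):
            self.selected = item
            for notify in self.listeners:
                notify("selection-changed", self)

        def attributes(self):
            # Assistive software can query size, position, and state.
            return {"size": self.size,
                    "position": self.position,
                    "selected": self.selected}

    def screen_reader(event, obj):
        print(event, obj.attributes())

    box = ListBox()
    box.add_listener(screen_reader)
    box.select("Item 3")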


5.5.11 Enable use with or without a pointing device

Software should enable users to perform tasks effectively both using only a pointing device (B:VAN92a) and without using any pointing device.

Example: Users enter text using an on-screen keyboard, and perform all other functions through menus, buttons, and screen commands. A user with no movement capabilities in any limbs might use only a pointing device such as an eye tracker or mouth-operated joystick.

Example: Users move input focus among and between windows displayed from different software using only keyboard input.

Example: Within text input areas, users use keyboard input to move the focus. If a significant amount of text is off-screen, users use keyboard input to move to the first or last page or screen of text.

Example: Users choose menu items, and perform other pointer-activated tasks via keyboard input.

5.5.12 Enable persistent activation

When users can activate a menu, control, or other user interface object to display additional information, software should allow that information to persist while the user engages in other tasks until the user chooses to dismiss it (B:THO93).

Note: Persistence of such information aids users who have language or cognitive disabilities, and reduces the number of steps required to access them.

Example: Users can keep menus displayed and refer to them while navigating and using other menus.

5.5.13 Provide undo

Users should have a mechanism that enables them to undo the effects of an unintended keystroke or button press (B:THO93).

Note: It can require significant time and effort to recover from such unintentional actions.
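A minimal Python sketch of an undo mechanism (illustrative only; how actions are recorded and reversed is an assumption, not a requirement of this guideline).

    # Sketch: a simple undo stack so the effect of an unintended
    # keystroke or button press can be reversed.
    class UndoStack:
        def __init__(self):
            self.actions = []

        def record(self, description, undo_function):
            # undo_function is a callable that reverses the action
            self.actions.append((description, undo_function))

        def undo(self):
            if self.actions:
                description, undo_function = self.actions.pop()
                undo_function()
                print("Undid: " + description)

    text = ["Hello"]
    history = UndoStack()

    text.append("world")                        # unintended action
    history.record("insert 'world'", text.pop)  # how to reverse it

    history.undo()
    print(text)                                 # ['Hello']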


5.6 Keyboard input configuration

Although the guidelines in this section refer to "keyboard input", the source of such input may be a logical equivalent rather than a physical keyboard.

5.6.1 Enable sequential entry of multiple keystrokes

The system should enable users to lock or latch modifier keys (e.g., Shift, Control, Alt) so that key combinations can be entered sequentially rather than by simultaneously pressing multiple keys (B:VAN88, B:VAN92a, B:THO93).

Note: This allows users who have physical impairments a means to enter combination key commands (e.g., Ctrl-C, Alt-Ctrl-Del) by pressing one key at a time.

Example: StickyKeys (available on most major platforms).
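A minimal Python sketch of the latching behavior defined in 5.1.22 (illustrative only; locking after repeated presses is omitted, and the event representation is an assumption).

    # Sketch: modifier keys entered one at a time are combined with the
    # next non-modifier keypress (latch behavior).
    MODIFIERS = {"Shift", "Ctrl", "Alt"}

    def sequential_keys(key_events):
        latched = set()
        combinations = []
        for key in key_events:
            if key in MODIFIERS:
                latched.add(key)          # remains logically pressed
            else:
                combinations.append(latched | {key})
                latched = set()           # latch released after one key
        return combinations

    # "Ctrl" then "c" entered sequentially acts like Ctrl-C.
    print(sequential_keys(["Ctrl", "c", "x"]))
    # e.g. [{'Ctrl', 'c'}, {'x'}] (set ordering may vary)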

5.6.2 Provide option for delay of key acceptance

The system should enable users to set a delay before a keypress is accepted (B:VAN88, B:VAN92a, B:THO93).

Note: This feature helps users who have limited coordination, who may hold intended keys down for a longer period of time than unintended keypresses. With this delay, brief unintended keypresses are ignored.

ignores keypresses that occur before a specified delay. These are mutually exclusive because a user can press a key long enough to be accepted, but then strike another key before the inter-keypress delay time -- in which case the two features would be in conflict.

Example: SlowKeys (available on most major platforms).
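A minimal Python sketch of a SlowKeys-style acceptance delay (the event representation and delay value are illustrative assumptions).

    # Sketch: a keypress is accepted only if the key is held at least
    # `acceptance_delay` seconds.
    def accept_keypresses(events, acceptance_delay=0.5):
        # events: list of (key, press_time, release_time) in seconds
        accepted = []
        for key, pressed, released in events:
            if released - pressed >= acceptance_delay:
                accepted.append(key)   # held long enough: accept
            # brief, likely accidental presses are ignored
        return accepted

    events = [("a", 0.0, 0.7),   # deliberate press, held 0.7 s
              ("s", 1.0, 1.1)]   # accidental brush, held 0.1 s
    print(accept_keypresses(events))   # ['a']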


5.6.3 Provide option for inter-keypress acceptance delay

The system should enable users to customize the delay between one keypress and acceptance of the next keypress (B:VAN88, B:VAN92a).

Note: This feature allows users to prevent a system from accepting inadvertent keypresses. The inter-keypress delay locks out acceptance for users who may

but then cause them to strike unintended keys (for example, hitting the same key 2 or 3 times in succession). cause them to strike keys.

delay has been set.

keypresses that occur before a specified delay. These are mutually exclusive because a user can press a key long enough to be accepted, but then strike another key before the inter-keypress delay time -- in which case the two features would be in conflict.

Example: BounceKeys (available on most major platforms).

5.6.4 Provide customization of key repeat rate

The system should enable users to customize the rate of key repeat.

5.6.5 Provide post-keypress acceptance delay of repeat onset

The system should enable users to set the length of time a key must be held down after keypress acceptance before key repeat begins, including the option to turn off key repeat altogether (B:VAN92a, B:THO93).

Note: This prevents users whose reaction time may be slow from producing unwanted repeated characters by holding down a key long enough to unintentionally initiate key repeat.

Example: RepeatKeys (available on most major platforms).


5.6.6 Provide keyboard control of pointer functions

The system should provide a keyboard alternative to standard pointing devices that enables control of pointer movement and button functions (B:THO93).

Note: This allows users who cannot use a pointing device to more easily control pointing functions.

Example: MouseKeys (available on most major platforms).

5.6.7 Notify about toggled key state

The system should enable users to customize whether keyboard state information is presented auditorily as well as visually (B:VAN92a).

Lock".

Note: Existing systems indicate a locked state with an up tone, and an unlocked state with a down tone.

Example: ToggleKeys (available on most major platforms).

5.6.8 Inform user of keyboard access status

Example: Each keyboard accessibility feature has a tone sequence that plays when it is activated or deactivated.

Example: A dialog informs the user that they have activated a keyboard accessibility feature.

5.6.9 Discourage accidental activation

It should not be easy for users to inadvertently turn on keyboard accessibility features.

Example: System software provides a master control panel that can defeat accessibility features.


5.6.10 Provide accelerators

Software should provide keyboard accelerators for frequently used features (B:VAN92a, B:THO93).

Note: In many cases, not every feature can be or should be mapped to an accelerator. Choice of what features to map to accelerator keys may be made by determining which features would constitute a core set of frequent and useful functions if a user were restricted to only those features.

Note: Accelerators are especially important for users who interact only through a keyboard or who map features to macros. Users who have disabilities benefit because they can reduce time consuming steps that would otherwise be required to activate accelerated features.

Example: User can press "Ctrl-P" to print.

5.6.11 Provide mnemonics

Software should provide mnemonics for menu and control items.

Note: Systems may provide the option to turn on or off display of mnemonics.

5.6.12 Use de facto keyboard equivalents for accessibility features

The following keyboard equivalents are de facto standards (B:VAN92a, B:MAC92, B:THO93, B:WIN95) whose use should be reserved for their current purpose:

Keyboard Mapping                      Used For
5 consecutive clicks of shift key     On/Off for StickyKeys
Shift key held down 8 seconds         On/Off for SlowKeys and RepeatKeys

5.6.13 Avoid conflicts with accessibility features

Software should not use modifier keys such as Shift, Alt, Ctrl, and Option to activate any function without another key.

Example: Graphic software would not use the shift key alone as a spray-paint accelerator, as multiple presses of the shift key could activate StickyKeys.



5.6.14 Enable remapping of keyboard functions

The system should enable users to remap accelerator and function keys (B:IBM88, B:THO93).

Example: A user who has a left arm and no right arm wishes to switch frequently used functions from the right to the left side of the keyboard.

5.6.15 Separate navigation and activation

Keyboard navigation to a control or object should not activate it; activation should require an explicit activation key or key sequence.

Example: A user presses the Tab key to move from a button to a set of checkboxes. When the first checkbox acquires focus, it does not become activated. Activation requires a separate step, such as pressing the spacebar.
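A minimal Python sketch of separating navigation from activation (the key names and control list are illustrative assumptions).

    # Sketch: Tab moves focus without activating; the spacebar is an
    # explicit activation step.
    controls = ["OK button", "Cancel button", "Checkbox 1"]
    focus = 0

    def handle_key(key):
        global focus
        if key == "Tab":
            focus = (focus + 1) % len(controls)      # navigation only
            print("Focus on: " + controls[focus])
        elif key == "Space":
            print("Activated: " + controls[focus])   # separate step

    for key in ["Tab", "Tab", "Space"]:
        handle_key(key)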

5.7 Display fonts

5.7.1 Enable font customization and legibility

System software and application-specific software should enable users to select among font sizes, styles, and colors as presented on the display (B:VAN92a, B:THO93, B:BER95).

5.7.2 Adjust the scale and layout of objects as font size changes

User interface objects should be scaled or have their layout adjusted as needed to account for changes in embedded or associated text size.

Example: As fonts grow, button and menu sizes grow to accommodate them. If they become large enough, the window increases in size to prevent buttons from clipping (overwriting) each other.
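A rough Python sketch of layout adjustment driven by font size; the sizing formula is a simplified assumption, not a requirement of this guideline.

    # Sketch: widget size is derived from the rendered text size plus
    # padding, so enlarging the font enlarges buttons and, in turn,
    # the window that contains them.
    def button_size(label, font_size, padding=8):
        char_width = font_size * 0.6          # rough text metrics
        width = len(label) * char_width + 2 * padding
        height = font_size + 2 * padding
        return (round(width), round(height))

    for font_size in (12, 24):
        print(font_size, button_size("Cancel", font_size))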

5.7.3 Provide access to information displayed in "virtual" screen regions

If the scale becomes large enough to displace information from the visible portion of the screen, then there should be a mechanism for accessing that information.

Example: Scrolling provides access to information not displayed on the physical screen.

Example: Notify the user that some information is not currently shown on the screen.


5.8 Audio output

Some guidelines in this section assume that software runs on systems whose hardware permits implementation of the given recommendations. This assumption will not always hold true, but guidelines are provided to maximize software accessibility in cases where the hardware supports the recommended behavior.

5.8.1 Allow user to set default volume level for audio output

The system should allow users to set the default volume level for system audio capabilities (B:EIA96).

5.8.2 Enable non-speech audio to occur in specified frequency range

Non-speech audio output should be able to occur in a specified frequency range (B:EIA96).

Note: This benefits users who have hearing impairments.

5.8.3 Provide specified frequency components for audio warnings and alerts

Alerts and other auditory warnings should include at least 2 strong mid to low frequency components, with specified maximum frequencies in Hz for each component (B:EIA96).

5.8.4 Enable audio customization

System software should enable users to customize attributes of audio output including, but not limited to, frequency and volume (B:VAN94).

Note: Customization is limited to the range the audio hardware can produce.

5.8.5 Allow users to choose visual or audio alert

Users should be able to choose whether alerts are presented visually, audibly, or both (B:VAN92a, B:THO93, B:BER95).

Example: For users who have chosen to receive auditory feedback, a beep is provided when an error message has been displayed or a footer message has been updated.


Example: Text on a dialog box is provided in place of a warning tone.

Example: Explanatory text is provided in place of a sequence of alert sounds.

5.8.6 Allow user to choose alternatives to audio content

The system should allow users to choose to have graphical and/or other alternatives substituted for task-relevant audio output (B:VAN92a, B:VAN94, B:BER95).

Example: The operating system provides the "ShowSounds" feature, allowing users to request that all audio output be accompanied by a visual alternative (B:VAN92b).

Example: Text alternatives for speech output can be displayed on systems providing closed caption support or displayed by braille devices through assistive software.

analog.

5.9 Graphics

5.9.1 Enable user to customize graphic attributes

Example: A stockbroker who has low-vision wishes to view a line graph of the price of a stock, and is able to change the thickness and color of the line.

Example: Attributes such as line, border, and shadow thickness can be changed by the user to better view bar charts, X-Y graphs, and state diagrams, but such changes would not affect meaning (e.g., it would not change the length of a temperature gauge unless the scale were lengthened proportionally).


5.9.2 Use text characters as text, not as drawing elements

In graphical interfaces, text characters should be used as text only, not to draw lines, boxes or other graphical symbols (B:THO93).

Note: Characters used in this way can confuse screen reader users.

Note: In a character-based display or region, graphic characters may be used, provided they do not cause confusion when read sequentially by users with assistive software.

5.10 Color

5.10.1 Provide color palettes designed for people who have visual impairments

The system should provide color palettes designed for people who have visual impairments (B:VAN94).

Note: Such palettes should include color combinations that provide high contrast, and other palette choices that avoid the use of colors that may confuse users who have common forms of red-green color blindness, cataracts, macular degeneration, and other visual impairments (B:VAN92a).

5.10.2 Allow users to create color palettes

The system should allow users to create their own color palettes, including background and foreground colors.

5.10.3 Allow users to customize color coding

Except in cases where warnings or alerts have been standardized for mission critical systems (e.g., red=network failure), the system should allow users to customize colors used to indicate selection, process, or object state/status.

Example: A user who cannot discriminate between red and green can set printer status colors to be blue for OK and yellow to indicate printer problems.

5.10.4 Provide alternatives to coding by hue

Hue should not be the only attribute used to code information (B:VAN92a, B:THO93).

Example: Colors are selected for contrast differences so that they are distinguishable on the basis of light/dark differences by users who cannot discriminate among different hues.

Example: If an indicator turns from green to red to show an error condition, then the user can also get text or audio information that indicates the error condition.
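A small Python sketch of redundant status coding; the particular colors, lightness values, and field names are illustrative assumptions.

    # Sketch: status is coded redundantly -- by color, by a light/dark
    # difference, and by text -- so it survives loss of hue information.
    STATUS_CODING = {
        "ok":    {"color": "green", "lightness": "dark",  "text": "OK"},
        "error": {"color": "red",   "lightness": "light", "text": "ERROR"},
    }

    def describe(status):
        coding = STATUS_CODING[status]
        return "%s indicator (%s %s)" % (coding["text"],
                                         coding["lightness"],
                                         coding["color"])

    print(describe("error"))   # "ERROR indicator (light red)"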

5.10.5 Honor color settings

Software should use color palette settings customized by users.

Example: Use colors defined by user-selected system resources.

Example: If a user chooses red as the color to represent a link, an embedded application should not override that setting to provide another color.

5.11 Errors and user notification

5.11.1 Place user notification in task-relevant location

User notification should be placed in a task-relevant location that is useful to users of assistive technologies (B:THO93).

Example: Critical information is presented in text, not only as an image or graph.

Example: An informative (non-error) message appears in the bottom left of a window. The consistent position allows users who are viewing only part of the screen through a magnifier to locate the message.

Example: An error message is presented in a location that makes clear the source and context in which the error occurred, so that a person reading from a braille display can easily determine the message context.

5.11.2 Allow task-relevant warning or error information to persist

Error or warning information should persist or repeat for as long as it is relevant or until the user dismisses it (B:VAN92a, B:THO93, B:VAN94).

Note: Auditory warnings or errors need only repeat at intervals appropriate to the user's task, and do not need to persist continuously.

Example: An error dialog box remains displayed until the user presses a "Close" button.


5.12 On-line documentation and help

5.12.1 Provide redundancy in on-line documentation and help

Information presented in pictures and graphics should be provided as descriptive text suitable for screen reading, printing, or braille conversion so that it can be read by an alternative method (B:VAN92a, B:VAN94, B:THO93).

Example: User can print text portion of on-line help and read text descriptions of any embedded graphics.

5.12.2 Provide generic help content

Where context permits, help should be sufficiently generic to fit a variety of input/output modalities and user preferences.

Note: For contexts where operation of a specific device such as a mouse is required, a generic description may not be possible. However, such specific descriptions need only occur in help about using that device, not in all contexts.

Example: Instead of "double-click the document", write "Open the document".

Example: Describe how to perform tasks using as many different input/output modalities as are available (e.g., mouse, keyboard, voice etc.).

5.12.3 Provide on-line documentation and help on access features

On-line help or documentation should provide information on access-related features.

Example: On-line help provides a section describing all features of interest for accessibility.

Example: On-line help explains keyboard-only use of the software.

Example: On-line documentation describes how to customize font size.

Example: On-line documentation describes which color palettes are good for red-green color blindness.


5.12.4 Allow substitution and customization of on-line documentation and help

Users should be able to substitute or customize on-line documentation and help, including access-related help.

Example: A user installs a speech control application. Help on the application is inserted into the help system for the system's existing word processor and spreadsheet, allowing the user to see speech-control-related help when looking up topics regarding other applications.

5.13 Customization of user preferences

5.13.1 Provide interface to customize user preferences

User preferences should be customizable through a graphical user interface, not only through editing of configuration files.

5.13.2 Provide ability to switch preferences

Users should be allowed to easily switch their user preferences among various default and alternative configurations.

Example: A user is able to quickly load a system default configuration on a computer currently using an alternative configuration for a user who has a disability.

5.13.3 Provide ability to use preferences across locations

Users should be allowed to utilize their user preferences on any compatible system.

Note: Portability is important for users with disabilities because they may find a system difficult or impossible to use without the preferences set to meet their needs.

Example: A user visiting a different building on the company network logs in and the system automatically locates and uses her personal preferences from the network.

Example: A user loads a preferences file from a floppy disk onto a new computer.
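A minimal Python sketch of portable preferences stored in a plain file; the file name and preference keys are illustrative assumptions, not a defined format.

    # Sketch: preferences stored in a plain, portable file that any
    # compatible system can load and apply.
    import json

    preferences = {"font_size": 18,
                   "high_contrast": True,
                   "key_acceptance_delay_ms": 300}

    with open("preferences.json", "w") as f:
        json.dump(preferences, f)

    # ...later, on another machine or after switching configurations:
    with open("preferences.json") as f:
        loaded = json.load(f)
    print(loaded["font_size"])   # 18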


5.13.4 Enable customization of common interface elements

Software should enable users to customize size and color of interface elements, widths, and window controls.

Example: Users can change the size and style of window title text.

Example: Users can change the size, color and shape of a caret indicating text input focus.

5.13.5 Enable cursor customization

Software should enable users to customize attributes of all cursors including, but not limited to, size, shape, color, and blink rate [1] (B:VAN92a, B:VAN94).

Example: A user with low-vision can change a text cursor from non-blinking to blinking, and customize the size to be more readily visible given their visual capabilities.

Example: A user with low-vision and a color deficiency can change the thickness and color of the focus cursor so that they can more easily see the current input focus.

5.13.6 Enable pointer customization

Software should enable users to customize pointer shape, size, and color (B:VAN92a, B:VAN94).

Example: A user enlarges the pointer so that they can more readily locate it.

5.14 Software control of pointing devices

5.14.1 Enable adjusting location of button functions

System software should enable users to adjust which pointing device button performs each function (B:THO93).

[1] Needs parameters (TBD)

Example: A left-handed user maps button functions 3, 2, 1 reading from left to right.

Example: A user chooses which buttons to use for each function based on his ability to reach them.

5.14.2 Enable multiple clicks with single button press and release

System software should enable users to set buttons on pointing devices to perform the multiple clicks used by the system software for common functions.

Example: A user with limited dexterity maps a trackball button to act as a doubleclick.

Example: A user with limited dexterity maps a trackball button to act as a quadruple-click to select a stream of text.

5.14.3 Enable delay of pointer button press acceptance

System software should enable users to set a delay between when a pointer button is pressed and when the press is accepted.

Example: A user who has tremors sets a delay to prevent tremor induced unintentional presses from being accepted as intentional presses.
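A minimal Python sketch of pointer-button acceptance delay; the timing values and data layout are illustrative assumptions.

    # Sketch: a button press is accepted only if held longer than a
    # user-set delay, filtering out tremor-induced presses.
    def accept_button_presses(presses, delay=0.3):
        # presses: list of (press_time, release_time) in seconds
        return [(down, up) for down, up in presses if up - down >= delay]

    presses = [(0.00, 0.05),   # brief tremor-induced press: ignored
               (1.00, 1.60)]   # intentional press: accepted
    print(accept_button_presses(presses))   # [(1.0, 1.6)]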

5.14.4 Enable adjustment of multiple-click interval

System software should enable users to adjust the interval required between clicks in a multiple click operation (B:THO93).

Example: A user who clicks slowly lengthens the interval so that two slow presses are accepted as the two clicks in a double-click intended to open a document.

5.14.5 Enable button hold with single button press and release

System software should enable users to adjust button presses so that they are not required to press and hold a button.

Example: Users have the option to "lock" single-clicks so that they are treated as continuous button presses, allowing them to swipe select across text without holding down a mouse button.

5.14.6 Enable pointer speed and ratio adjustment

System software should enable users to adjust the speed or ratio at which the pointer moves in response to a movement of the pointing device (B:THO93).

Example: Users may change the speed of pointer movement by setting an absolute speed or a ratio between movements of the pointing device and the pointer (e.g., setting the relationship between movement of the pointing device and the pointer to a 3:1 mapping).

5.14.7 Provide alternatives to chorded button and keypresses

Software should provide a non-chorded alternative for any chorded button presses, whether chorded presses are on the pointing device alone or are on the pointing device in combination with a keyboard keypress (B:THO93).

Note: Chorded presses require actions that may be difficult or impossible for users with motor impairments.

Example: If a function can be performed by pressing mouse buttons 1 and 2 simultaneously, then it also can be performed using one mouse button (e.g., using one mouse button to display a menu providing the same function).

Example: If a task can be performed by holding down a mouse button and dragging, then it is also possible to perform this task using menu operations (e.g., to select a menu operation called "copy").

5.15 Window appearance and behavior

5.15.1 Enable navigation directly to windows

System software should enable users to use the keyboard to move focus directly to any window currently running.

Note: This allows users to move among windows as quickly with a keyboard as they might with a pointing device.

Example: By browsing a list of currently running windows, the user selects a window that receives focus.


5.15.2 Detect window location and attributes

Information on the title, size, location, focus and other attributes of existing windows should be available to assistive software.

Example: There is a system variable that is readable by assistive software that is written whenever a window is created or moved. Using this information, screen magnification software, for example, can determine if a window is currently being viewed under magnification.
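A minimal Python sketch of one way window attributes might be exposed to assistive software; the registry structure and function names are assumptions, not a system API.

    # Sketch: the system keeps a registry of window attributes that
    # assistive software (e.g., a magnifier) can query and watch.
    windows = {}

    def notify_assistive_software(event, window_id):
        # A magnifier could use this to decide whether the window lies
        # inside its current viewport.
        print(event, windows[window_id])

    def create_window(window_id, title, x, y, width, height):
        windows[window_id] = {"title": title, "x": x, "y": y,
                              "width": width, "height": height,
                              "has_focus": False}
        notify_assistive_software("created", window_id)

    create_window("w1", "Draft", x=100, y=80, width=640, height=480)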

5.15.3 New window location

New windows should not appear off of the physical screen or completely cover existing windows.

5.15.4 Always on top windows

The system should enable a window to remain on top of other windows even if the window on top does not have input focus.

Example: A user has a visually displayed keyboard on screen that may be kept on top of other windows so that it is visible at all times.

5.15.5 Always on top windows: when to use

Software should use always on top windows only where a window is required continuously in order for users to perform a task across windows or applications.

Example: An on-screen keyboard is used to enter text across a variety of applications.

Example: A screen magnifier must be the top level window through which all other windows are viewed.

5.15.6 Always on top windows: user control with multiple instances

Where there is a conflict among multiple windows that are specified as "always on top", users should be able to control which window is on top or to turn off the behavior altogether for any or all windows.


5.15.7 Enable choice of effect of implicit window focus on stacking order

Users should be able to choose whether a window receiving implicit focus is automatically placed on top of all other windows or does not change its stacking position (with the exception of an "always on top" window, see above).

Note: For some users it is easier to move the pointer among windows than to click on them to bring them to the top.

5.15.8 Provide unique window titles

Every window should have a title not shared with any other window currently displayed, even if several windows provide multiple views of the same object.

Example: User views a document in a window titled "Draft", and then opens a second view of that same document. The second window is labeled as "Draft: 2".
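A minimal Python sketch of generating unique window titles for multiple views of the same object; the numbering scheme simply follows the example above and is not prescribed by the guideline.

    # Sketch: additional views of the same object get a numbered title
    # so every open window has a unique name.
    open_titles = []

    def unique_title(base):
        if base not in open_titles:
            title = base
        else:
            n = 2
            while "%s: %d" % (base, n) in open_titles:
                n += 1
            title = "%s: %d" % (base, n)
        open_titles.append(title)
        return title

    print(unique_title("Draft"))   # "Draft"
    print(unique_title("Draft"))   # "Draft: 2"
    print(unique_title("Draft"))   # "Draft: 3"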

5.16 Keyboard input focus

5.16.1 Provide focus cursor

The system should provide a focus cursor that indicates current input focus.

and they need to know where their keyboard input and actions will take effect.

5.16.2 Provide keyboard navigation

Input focus should be assignable to any control via keyboard input.

Note: Enabling users to move through controls spatially can save time by allowing them, for example, to move to the day one week before, directly above in a calendar grid, rather than having to step through each day.

arrow key to move to the list below, and so on.

5.16.3 Provide navigation to logical groups of controls

Keyboard navigation should move input focus among logical groups of controls in an order appropriate for the task and interface layout (B:VAN94).

Note: For users who navigate via the keyboard, the order and grouping in which navigation occurs may be the only order and grouping in which they can use controls.

Example: The Tab key moves focus to an appropriate group of buttons, followed by the next group, and so on in a task and conceptually appropriate order. Within each cluster of controls, the user moves among related controls by pressing the arrow key.

5.16.4 Retain input focus location

When a window regains focus via keyboard navigation, then the object that had focus within that window when that window last had focus should regain focus.

Note: Users may need multiple keystrokes (e.g., tab-tab-tab-tab) to return if focus is not retained.

5.16.5 Enable assistive technologies to track focus-related information

Information on focus cursor attributes, location, and events should be available to assistive technologies (B:VAN94, B:WIN95).

Note: Screen magnification software typically provides a magnified region as a "viewport" around the current input focus. While users may be able to manually move this viewport, software usability is greatly enhanced when the assistive magnification software can track current input focus location, including the focus on controls (e.g., focus cursor) within an application.


Potential References

BER95 Bergman, E., Johnson, E. Towards Accessible Human-Computer Interaction, Advances in HCI, Volume 5, Ablex Publishing Corporation, 1995.

BLA92 Blattner, M. M., Glinert, E.P., Jorge, J.A., and Ormsby, G.R., Metawidgets: Towards a theory of multimodal interface design. Proceedings: COMPASAC 92, pp 115- 120, IEEE Press, 1992.

BRO89 Brown, C. Computer Access in Higher Education for Students with Disabilities, 2nd Edition. George Lithograph Company, San Francisco. 1989.

BRO92 Brown, C. Assistive Technology Computers and Persons with Disabilities, Communications of the ACM, pp 36-45, 35(5), 1992.

Users with Disabilities, Human Factors, pp 407-422, 32(4), 1990.

CHU92 Church, G., and Glennen, S. The Handbook of Assistive Technology, Singular Publishing Group, Inc., San Diego, 1992.

Interface to the X Window System. The X Resource. O'Reilly and Associates, Inc. April, 1993.

EDW93 Edwards, A., Edwards, E., and Mynatt, E. Enabling Technology for Users with Special Needs, InterCHI '93 Tutorial, 1993.

EIA96 Resource Guide for Accessible Design of Consumer Electronics, EIA/EIF, 1996.

ELK90 Elkind, J. The Incidence of Disabilities in the United States, Human Factors, pp 397-405, 32(4), 1990.

EME92 Emerson, M., Jameson, D., Pike, G., Schwerdtfeger, R., and Thatcher, J. Screen Reader/PM. IBM Thomas J. Watson Research Center, Yorktown Heights, NY, 1992.

tions of the ACM, pp 32-35, 35(5), 1992.

GRI90 Griffith, D. Computer Access for Persons who are Blind or Visually Impaired: Human Factors Issues. Human Factors, pp 467-475, 32(4), 1990.

Disabilities: A Strategy for Alternate Access System Developers. December 1988.

KAP92 Kaplan, D., DeWitt, J., Steyaert, M. Telecommunications and Persons with Disabilities: Laying the Foundation. World Institute on Disability. November, 1992.

1993 International Workshop on Intelligent User Interfaces. pp 243-246. Orlando, FL. New York: ACM Press 1993.

LAZ93 Lazzaro, Joseph J. Adaptive Technologies for Learning and Work Environments. American Library Association, Chicago and London, 1993

MAC92 Macintosh Human Interface Guidelines, Addison-Wesley, 1992.


MAN91 Managing Information Resources for Accessibility, U.S. General Services Administration, Clearinghouse on Computer Accommodation, 1991.

MCC94 McCormick, John A. Computers and the Americans with Disabilities Act: A Manager's Guide. Windcrest, 1994.

MCM92 McMillan, W.W. Computing for Users with Special Needs and Models of Computer-Human Interaction, pp. 143-148. Addison Wesley, 1992.

MYN94 Mynatt, E. Auditory Presentation of Graphical User Interfaces, in Kramer, G. (ed.), Auditory Display, Addison-Wesley: Reading MA., 1994.

NEW93 Newell, A.F., and Cairns, A. Designing for Extraordinary Users. Ergonomics in Design, October, 1993.

NIE93 Nielsen, J. Usability Engineering. Academic Press, Inc., San Diego. 1993.

NSI93 ANSI T1.232-1993, Operations, Administration, Maintenance, and Provisioning for the Telecommunications Management Network (TMN).

PER91 Perritt Jr., H.H. Americans with Disabilities Act Handbook, 2nd Edition. John Wiley and Sons, Inc., New York, 1991.

SAU91 Sauter, S.L., Schleifer, L.M., and Knutson, S.J.  Work Posture, Workstation Design, and Musculoskeletal Discomfort in a VDT Data Entry Task. Human Factors, pp 407-422, 33(2), 1991.

SCH93 Schmandt, C. Voice Communications with Computers, Conversational Systems. Van Nostrand Reinhold, New York, 1993.

THO93 Thoren, C. Nordic Guidelines for Computer Accessibility. 1993.

VAN83 Vanderheiden, G.C., Curbcuts and Computers: Providing Access to Computers and Information Systems for Disabled Individuals. Keynote Speech at the Indiana Governor's Conference on the Handicapped, 1983.

VAN88 Vanderheiden, G.C., Considerations in the design of Computers and Operating Systems to increase their accessibility to People with Disabilities", Version 4.2, Trace Research & Development Center, 1988.

VAN90 Vanderheiden, G.C., Thirty-Something Million: Should They be Exceptions? Human Factors, 32(4), 383-396. 1990.

Trace Research and Development Center, Madison, Wisconsin, 1991.

VAN92A Vanderheiden, G.C. Making Software more Accessible for People with Disabilities: Release 1.2. Trace Research and Development Center, Madison, Wisconsin, 1992.

VAN92B Vanderheiden, G.C. A Standard Approach for Full Visual Annotation of Auditorily Presented Information. Trace Research & Development Center, 1992.


VAN94 Vanderheiden, G.C. Application Software Design Guidelines: Increasing the Accessibility of Application Software to People with Disabilities and Older Users", Version 1.1. Trace Research & Development Center, 1994.

WAL93 Walker, W.D., Novak, M.E., Tumblin, H.R., Vanderheiden, G.C. Making the X Window System Accessible to People with Disabilities. Proceedings: 7th Annual X Technical Conference. O'Reilly & Associates, 1993.

WIN95. The Windows Interface Guidelines for Software Design, Microsoft Press, Microsoft Corporation. 1995.

Received on Thursday, 14 August 1997 03:45:13 UTC