- From: Charles McCathieNevile <charles@w3.org>
- Date: Mon, 8 Feb 1999 14:39:46 -0500 (EST)
- To: Jon Gunderson <jongund@staff.uiuc.edu>
- cc: w3c-wai-ua@w3.org
As a preliminary comment, most of these things are covered under the general principle 'provide device-independent access to all functionality of the user agent' (which seems a lot like the current checkpoint 3.1.1). The difference is that this proposal splits out particular functions and requires them (of particular browsers, in the current incarnation). The issue of whether to implement the W3C Recommendation for the DOM is separate.

Charles McCathieNevile

On Mon, 8 Feb 1999, Jon Gunderson wrote:

Based on feedback from the group, I think our current checkpoints related to assistive technology compatibility need to be reconsidered for the following reasons:

1. The current checkpoints for compatibility read more like techniques than statements of assistive technology needs.

2. The DOM can only be considered part of a solution for desktop user agents, for the following reasons:

a. The DOM does not provide any information about, or emulation of, controls for the other parts of the user interface (i.e. controls, menus, status lines, dialog boxes). This information needs to come from a non-DOM source; the DOM will never provide information about, or control of, these parts of the user interface.

b. The DOM does not have a defined, interoperable interface for use by external programs. Some group members say this is not a major issue (including myself at times), but it is potentially a weak link if user agents running on the same platform use different methods to expose the DOM, since assistive technology would then need to "know" where to look. The DOM also has no conventions for simultaneous access: how would the DOM resolve manipulation requests from both the user agent and the assistive technology, and how would the user agent tell the AT that it has changed something?

c. The use of the DOM would require assistive technology to subclass the user agent as a special technology, and some assistive technology companies may find this requirement too restraining as the primary mechanism for accessibility, especially on MS-Windows platforms that have accessibility models based on Active Accessibility.

Denis Anson made a good point: if we push this type of technique, it means that users with disabilities will need to wait for AT developers to provide access to new implementations of the DOM. More general techniques, like Active Accessibility, offer improved timeliness for new releases of user agents.
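To make point 2a concrete, here is a minimal sketch (added for illustration, not part of the original message, and assuming a present-day DOM scripting environment) of what a program could read if the DOM were its only interface. Everything it can reach is document content; the user agent's own menus, toolbars, status line, and dialog boxes never appear in this tree, so that information has to come from some other source such as a platform accessibility API.

    // Minimal sketch: what an assistive technology could read if the DOM
    // were its only interface. The walk reaches document content only;
    // the user agent's menus, toolbars, status line, and dialog boxes are
    // not part of this tree (point 2a), and nothing here notifies an
    // external program when the user agent changes the tree (point 2b).
    function collectText(node: Node, out: string[]): void {
      if (node.nodeType === Node.TEXT_NODE && node.nodeValue) {
        const text = node.nodeValue.trim();
        if (text.length > 0) {
          out.push(text);
        }
      }
      for (let child = node.firstChild; child !== null; child = child.nextSibling) {
        collectText(child, out);
      }
    }

    const rendered: string[] = [];
    collectText(document.body, rendered);
    console.log(rendered.join(" "));

Whether such a walk can even be performed from a separate process, and how two programs would coordinate changes to the same tree, is exactly the interoperability gap raised in points 2b and 2c.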
So I would like to suggest five checkpoints for people to think about, criticize, modify, and/or comment on:

** The following checkpoints are based on the assistive technology's point of view **

Checkpoint 6.2.1 [Priority 1]
Allow assistive technology to access information about the current user interface controls (windows, menus, toolbars, status bars, dialog boxes).
Primary techniques: accessibility APIs or use of operating system standard controls.

Checkpoint 6.2.2 [Priority 1]
Allow assistive technology to simulate the selection and activation of user interface and document controls (windows, menus, toolbars, status bars, dialog boxes).
Primary techniques: accessibility APIs or use of operating system standard controls.

Checkpoint 6.2.3 [Priority 1]
Allow assistive technologies to access information about the current information being rendered by the user agent.
Primary techniques: accessibility APIs that provide information on document rendering, and/or the DOM.

Checkpoint 6.2.4 [Priority 1]
Allow accessibility features (accessibility flags and interfaces) of the operating system to provide alternative rendering information and user interfaces for the user agent.

Checkpoint 6.2.5 [Priority 2]
Allow assistive technology to change the rendering of document information in the user agent.

Rationale: In some cases it may be useful for the assistive technology to change the rendering of a document. For example, for a person with certain types of visual learning disabilities it may be important to simplify the rendering of the document and to allow the person to use the mouse to point at objects and have the contents of the object spoken to them. It could also be used for table linearization, if the assistive technology felt that was the best way to provide access to table information.
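As an illustration of the table linearization example in the rationale, here is a rough sketch (again added here, not from the original message; it assumes a present-day DOM environment, and the element id "data" is purely hypothetical) of an assistive technology rewriting a table into a linear reading order:

    // Sketch: linearizing a table by rewriting the document tree, one way an
    // assistive technology with write access to the DOM could change how
    // information is presented (checkpoint 6.2.5). The id "data" is a
    // hypothetical example.
    function linearizeTable(table: HTMLTableElement): string[] {
      const lines: string[] = [];
      const headerCells = table.rows.length > 0 ? table.rows[0].cells : undefined;
      const headers = headerCells
        ? Array.from(headerCells, c => c.textContent?.trim() ?? "")
        : [];
      for (let r = 1; r < table.rows.length; r++) {
        const row = table.rows[r];
        for (let c = 0; c < row.cells.length; c++) {
          const label = headers[c] ?? `Column ${c + 1}`;
          lines.push(`${label}: ${row.cells[c].textContent?.trim() ?? ""}`);
        }
      }
      return lines;
    }

    const table = document.getElementById("data");
    if (table instanceof HTMLTableElement) {
      const list = document.createElement("ul");
      for (const line of linearizeTable(table)) {
        const item = document.createElement("li");
        item.textContent = line;
        list.appendChild(item);
      }
      // Replace the table's rendering with a simple list that can be spoken in order.
      table.replaceWith(list);
    }

Whether a user agent should let an external program make this kind of change is precisely what checkpoint 6.2.5 asks.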
Jon Gunderson, Ph.D., ATP
Coordinator of Assistive Communication and Information Technology
Division of Rehabilitation - Education Services
University of Illinois at Urbana/Champaign
1207 S. Oak Street
Champaign, IL 61820

Voice: 217-244-5870
Fax: 217-333-0248
E-mail: jongund@uiuc.edu
WWW: http://www.staff.uiuc.edu/~jongund
WWW: http://www.als.uiuc.edu/InfoTechAccess

--Charles McCathieNevile            mailto:charles@w3.org
phone: +1 617 258 0992   http://purl.oclc.org/net/charles
W3C Web Accessibility Initiative    http://www.w3.org/WAI
MIT/LCS  -  545 Technology sq., Cambridge MA, 02139, USA

Received on Monday, 8 February 1999 14:39:49 UTC