- From: Al Gilman <asgilman@iamdigex.net>
- Date: Sun, 16 Jul 2000 12:26:00 -0400
- To: "Marti" <marti47@MEDIAONE.NET>, "Web Content Accessibility Guidelines" <w3c-wai-gl@w3.org>
At 10:58 AM 2000-07-16 -0400, Marti wrote:
>Greg's suggestion about "sensory modality" was good but leads us back to the
>problem of needing to interpret the language (say that again in English
>please).

Saying this in plainer, more universally accessible language takes more words.

[analysis or homework, not plain statement:]

The situation that has to be avoided is the following: some information is
available to the user and the user's software only in a form or forms which
depend on a unique sense: sight, hearing, touch, etc. The goal is that for
all information the user has options as to which sense is used to receive
the information at the human-to-computer interface. [Ditto for commands and
actuation means.]

So the key words are:

- sense, as in sight, hearing, smell, etc.
- dependency, a necessary requirement
- unique, only one

I hope those ideas are sufficiently broadly understood that we can base our
explanation on them. If so, the statement of the principle could run
something like the following:

[attempt at a broadly accessible statement, a.k.a. plain English:]

The data available to the user representing any particular piece of
information should not all depend on one sense (vision, hearing, etc.) for
presentation to the user. The data used to represent information must either
be compatible with client processing alternatives which present to different
senses, or else data alternatives must be provided which can be presented to
different senses.

Note: where the data provided to the user depends on client processing to
reach the respective senses discussed above, this processing must be readily
available to the user.

Al
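
For instance, a minimal HTML sketch of the "data alternatives" idea (the file names and figures are hypothetical, not from the original message): the same information reaches the user through sight, hearing, or touch, depending on which presentation the client renders.

```html
<!-- The image carries the information visually; the alt text carries the
     same information as text, which a screen reader can speak or a braille
     display can render to touch. No single sense is a necessary requirement. -->
<img src="sales-chart.png"
     alt="Sales rose from 120 units in May to 180 units in June."
     longdesc="sales-chart-description.html">
```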
Received on Sunday, 16 July 2000 12:22:14 UTC