Re: direct and spatial mapping to functionalities

Using the "standard" keyboard API is not sufficient, for the reasons Marja
has pointed out. But it is necessary, since it is the use of the standard
API, rather than the keyboard itself, that the developers of alternative
input technology rely on (except those who are emulating mouse input of
course).
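
(Editorial aside: a minimal sketch of what relying on the standard keyboard
API means in practice. The toolkit, function name, and key binding below are
hypothetical illustrations, not anything from the guidelines; the point is
only that a function wired to the standard key-event path can be driven by a
synthetic key event, which is what most alternative input software produces.
Platform details vary.)

    import tkinter as tk

    def open_document(event=None):
        # Stand-in for any user-agent function.
        print("open_document invoked")

    root = tk.Tk()
    # Functionality exposed through the toolkit's standard keyboard path.
    root.bind("<Control-o>", open_document)

    # Roughly what alternative input software does: inject a key event
    # through the standard API instead of pressing a physical key.
    root.after(100, lambda: root.event_generate("<Control-o>"))
    root.after(300, root.destroy)
    root.mainloop()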

However, I think that is a side issue and could be handled as an editorial
amendment (if it isn't handled already - I have not checked the 4 October
draft yet).

Charles McCN

On Mon, 4 Oct 1999, Marja-Riitta Koivunen wrote:

  At 11:29 AM 10/3/99 -0500, thatch@us.ibm.com wrote:
  >
  >
  >MK: Does everybody else understand this perfectly and agree that the current
  >wording reflects it? Then I will stop.
  >
  >JT: I am not going to ask you to stop, but even if 2.1 is flawed
  >(and I don't think it is) it should remain as is. The wording is
  >unequivocal and clear. It is, in my opinion, the most important
  >checkpoint for access to software for multiple disabilities. Water
  >it down or generalize it and it will lose its focus and force.
  
  The point I'm trying to make is that asking for keyboard access is too
  general. It does not say (not to me, anyway) that you need a direct mapping
  from keyboard keys to functions, so that merely emulating pointing at
  graphical objects with keyboard keys will not do.
  
  Maybe I'm wrong, but I don't think using the keyboard API helps. Isn't the
  keyboard API just a way to get key codes from the keyboard to the user
  agent? The user agent then interprets them, and according to our guidelines
  it could interpret them as moving a cursor and clicking, which does not help
  a user who has difficulties with mouse control very much. It would be better
  to skip the pointing part and activate the functions immediately.
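
(Editorial aside: a small sketch, in Python with hypothetical names, of the
distinction Marja draws here. With pointer emulation the arrow keys only move
a virtual cursor and nothing happens until it reaches the right object; a
direct key-to-function map activates the function immediately.)

    def reload_page():
        print("page reloaded")

    # Pointer emulation: arrow keys merely move a virtual pointer; the
    # function runs only after the pointer lands on the right object.
    pointer = [0, 0]
    widgets = {(12, 3): reload_page}        # reachable only by position

    def on_key_emulated(key):
        moves = {"Up": (0, -1), "Down": (0, 1), "Left": (-1, 0), "Right": (1, 0)}
        if key in moves:
            pointer[0] += moves[key][0]     # many presses before anything happens
            pointer[1] += moves[key][1]
        elif key == "Return":
            widgets.get(tuple(pointer), lambda: None)()

    # Direct mapping: one chord, one function, no spatial search.
    keymap = {"Control+r": reload_page}

    def on_key_direct(chord):
        if chord in keymap:
            keymap[chord]()

    on_key_emulated("Return")               # nothing: pointer is not on the widget
    on_key_direct("Control+r")              # activates immediately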
  
  Another thing is that, in my opinion, there could be a (maybe lower-priority)
  checkpoint for spatial mapping too, or at least for providing some memory aid
  for the available functions, so that users don't have to rely so much on
  their memory.
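
(Editorial aside: the memory aid suggested here could be as simple as letting
the user enumerate such a key map on demand; a sketch with made-up bindings.)

    keymap = {
        "Control+r": "Reload page",
        "Control+o": "Open document",
        "Alt+Left":  "Go back",
    }

    def show_available_functions():
        # Present the available functions and their keys instead of
        # relying on the user's memory.
        for chord, description in sorted(keymap.items()):
            print(f"{chord:12} {description}")

    show_available_functions()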
  
  Marja
  
  >2.1 By default and without additional customization, ensure that all
  >functionalities offered by the user agent are accessible using the keyboard.
  >
  >Jim Thatcher
  >IBM Special Needs Systems
  >www.ibm.com/sns
  >HPR Documentation page: http://www.austin.ibm.com/sns/hprdoc.html
  >thatch@us.ibm.com
  >(512)838-0432
  >
  >
  >Marja-Riitta Koivunen <marja@w3.org> on 09/29/99 10:56:43 AM
  >
  >To:   Ian Jacobs <ij@w3.org>
  >cc:   ian@w3.org, w3c-wai-ua@w3.org
  >Subject:  Re: direct and spatial mapping to functionalities
  >
  >
  >
  >
  >At 11:33 AM 9/29/99 -0400, Ian Jacobs wrote:
  >>Marja-Riitta Koivunen wrote:
  >>>
  >>> While thinking about conformance I was looking at the guidelines and
  >>> checkpoints again. I still don't like the word keyboard in guideline 2. I
  >>> also think keyboard access is not what we want to say in many checkpoints,
  >>> e.g. in
  >>>
  >>> 2.1 By default and without additional customization, ensure that all
  >>> functionalities offered by the user agent are accessible using the
  >>> keyboard.
  >>>
  >>> So you could use the keyboard arrow keys to point and some other key to
  >>> select and still conform? Or what about my laptop keyboard with a finger
  >>> mouse built into it?
  >>>
  >>> I think we want to say something about offering a direct mapping from
  >>> input device keys to the functionalities, as opposed to a spatial mapping
  >>> with pointing and graphical objects. In the first case we usually have
  >>> many keys or key combinations that the user needs to remember, but no
  >>> need to point or see. In the latter case we need to remember just a few
  >>> keys and some way to point in 2D (or 3D). If we can present the
  >>> activation of functionalities with graphical objects or by using force
  >>> feedback, it often helps memory but it is slower to get to the functions.
  >>>
  >>> I think both mappings are important. The point&click UI with an
  >>> explorable memory aid (e.g. graphical objects, a sound map, a force
  >>> feedback map) helps the cognitively disabled (and everyone with human
  >>> memory); the direct mapping helps the motorically disabled, because some
  >>> key or Morse code etc. can be mapped directly to the function without
  >>> needing to go through the spatially located object.
  >>
  >>There is an analogy between serial access to links (which provides
  >>context as-you-go) and direct access (which is faster, but requires
  >>more experience). Explaining the utility of both for access
  >>to UA functionality would be useful (we already do so for navigation).
  >>
  >>However, I think your abstraction overlooks the need that motivated
  >>this Guideline: assistive technology today relies on software using
  >>the standard OS keyboard API (please correct me if I'm wrong). This
  >>Guideline is less abstract than others since it addresses today's
  >>technology and today's requirements. Device-independence captures
  >>the principle while talking about the keyboard API captures today's
  >>need.
  >
  >A guideline is not a checkpoint, so it is not what people check; it does not
  >even have priorities. It is put higher up because it is important, yet this
  >guideline does not always need to be conformed to. So I'm confused.
  >
  >Then I look at the actual checkpoints under the guideline, such as
  >
  >> 2.1 By default and without additional customization, ensure that all
  >> functionalities offered by the user agent are accessible using the
  >> keyboard.
  >>
  >> So you could use the keyboard arrow keys to point and some other key to
  >> select and still conform? Or what about my laptop keyboard with a finger
  >> mouse built into it?
  >
  >And I'm even more confused. Here I agree we should mention the keyboard, but
  >even when we have mentioned it, it seems not to help. Can I use some of my
  >keyboard keys (or even my keyboard's built-in mouse for pointing) and
  >conform to this? I don't think this is what we want.
  >
  >Does everybody else understand this perfectly and agree that the current
  >wording reflects it? Then I will stop.
  >
  >Marja
  >
  >
  >>I am not yet convinced (but still open!) that presenting the
  >>Guideline as requiring direct v. serial access to UA functionality
  >>will adequately address the requirement of today's technology.
  >>
  >>Please let me know if my comments reflect an understanding of
  >>your suggestion.
  >>
  >>Thank you,
  >>
  >> - Ian
  >>
  >>
  >>> A separate thing is then how to present all this. If the user can see,
  >>> she can have a memory aid on the screen (or even on paper) also for
  >>> directly mapped keyboard events; if she cannot, she needs to rely more on
  >>> memory. On the other hand, she may use spatial mapping and exhaustive
  >>> spatial search with sound or force feedback to help her memory. The
  >>> graphical object model provides the memory aid naturally, but can also be
  >>> badly designed.
  >>>
  >>> Marja
  >>
  >>--
  >>Ian Jacobs (jacobs@w3.org)   http://www.w3.org/People/Jacobs
  >>Tel/Fax:                     +1 212 684-1814
  >>Cell:                        +1 917 450-8783
  >>
  >
  >
  >
  >
  

--Charles McCathieNevile            mailto:charles@w3.org
phone: +1 617 258 0992   http://www.w3.org/People/Charles
W3C Web Accessibility Initiative    http://www.w3.org/WAI
MIT/LCS  -  545 Technology sq., Cambridge MA, 02139,  USA

Received on Monday, 4 October 1999 11:50:09 UTC