- From: Matthew Smith <matt@kbc.net.au>
- Date: Wed, 09 Apr 2003 17:33:57 +0930
- To: Jonathan Chetwynd <j.chetwynd@btinternet.com>
- CC: WAI Interest Group <w3c-wai-ig@w3.org>
Jonathan, All

> Does anyone know how touchscreens and interactive whiteboards activate
> onclick and onmouseover events?

I have had some experience developing a kiosk application with Microtouch (now 3M Touch Systems) touch screens. Since I knew HTML and didn't have the time to learn another UI, I used Mozilla with all the window decoration and controls turned off, running under a minimal XFree86 installation. A light touch activated onMouseOver events and a sharp tap acted as a click.

However, the system was anything but accessible and has left me slightly wary of (touch screens + Web browser) in an Assistive Technologies context, at least with casual use. Although the tests were with a technically able-bodied sample, many people struggled to use the system until they had developed the "knack". If I wired my events to onMouseOver, they would tap too hard and the event would be missed; operating a standard link (no JavaScript handlers) soon found the people who didn't tap hard enough. Touch screens can also be problematic for anyone with motor or coordination problems.

My final solution for my kiosks (on which I am still working) is to stay with the touch screen (whichever event handler you choose), but to pass the data as XML, translating it to XHTML for the screen, with an option to translate to SABLE (an XML speech-synthesis markup) for speech output. The alternate input device will be either the EZ interface or something similar, probably providing accesskey events.

Hope this helps.

Cheers

M

--
Matthew Smith
IT Consultant - KBC, South Australia
KBC Web Site      http://www.kbc.net.au
PGP Public Key    http://gpg.mss.cx
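For illustration, a minimal sketch of the dual-event wiring described above, assuming the Mozilla-era DOM; the element ID and the go() helper are hypothetical, not taken from the kiosk code:

    // Wire both onMouseOver and onClick to one handler so a light
    // touch and a sharp tap both activate the control.
    function go(url) {
      if (go.fired) return false;   // guard against mouseover+click double-fire
      go.fired = true;
      window.location.href = url;
      return false;
    }

    var link = document.getElementById("next-page");  // hypothetical ID
    link.onmouseover = function () { return go(this.href); };
    link.onclick     = function () { return go(this.href); };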
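And a minimal sketch of the XML-to-XHTML leg of that pipeline, assuming Mozilla's XSLTProcessor and document.load() of the era; the file names and element ID are hypothetical, and the SABLE leg would run the same data through a second stylesheet:

    var xmlDoc = document.implementation.createDocument("", "", null);
    var xslDoc = document.implementation.createDocument("", "", null);
    var pending = 2;

    function transform() {
      if (--pending > 0) return;            // wait until both have loaded
      var proc = new XSLTProcessor();
      proc.importStylesheet(xslDoc);        // XHTML-producing rules
      var frag = proc.transformToFragment(xmlDoc, document);
      document.getElementById("screen").appendChild(frag);
    }

    xmlDoc.addEventListener("load", transform, false);
    xslDoc.addEventListener("load", transform, false);
    xmlDoc.load("content.xml");             // hypothetical kiosk data
    xslDoc.load("to-xhtml.xsl");            // hypothetical stylesheet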
Received on Wednesday, 9 April 2003 04:04:16 UTC