Re: touchscreens and interactive whiteboards

Great to hear this is happening.

Here is the picture I have gathered from listening.  Does this match
your experience?

   "For dealing with a walk-up member of the public with a touch-screen,
    there is only one class of pointer event, which one could call onTouch.
    The taps etc. distinctions that people can train themselves to exploit
    with equipment they use on a continuing basis do not apply in this sort
    of a casual contact."

The applications have to be designed to deal with this.  No right button,
no doubleClick.
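
In code terms that means routing every pointer distinction onto the one
event you can rely on.  A minimal sketch, assuming plain DOM-0 JavaScript
(the doActivate action and the link selection are just illustrative):

   // Collapse all pointer distinctions into a single activation.
   // A walk-up kiosk user gets no right button, no double-click.
   function doActivate(el) {
     window.location = el.href;       // whatever the kiosk action is
   }
   function onTouch() {
     doActivate(this);
     return false;                    // suppress follow-up events
   }
   var links = document.getElementsByTagName("a");
   for (var i = 0; i < links.length; i++) {
     links[i].onclick = onTouch;      // the sharp tap
     links[i].onmouseover = onTouch;  // the light touch counts too
     links[i].ondblclick = onTouch;   // fold double taps back in
   }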

Whether the response to onTouch is 'activate' or 'focus' is a matter of
raging debate where accessKey is concerned, and may indeed need to be a
mode adjustment for people with different HCI capabilities.
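
That mode switch can be as small as a single flag; a variation on the
sketch above (purely illustrative):

   // One user-selectable mode: a touch either activates at once or
   // merely focuses, so a second touch (or a key) confirms.
   var touchActivates = true;         // flipped from a settings screen
   function onTouch() {
     if (touchActivates) {
       doActivate(this);              // doActivate as sketched above
     } else {
       this.focus();                  // first touch just moves focus
     }
     return false;
   }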

Matt, are there available-on-the-Web examples of the class of applications
that you will be serving through your system?  The matter of single-authoring
something that may be presented on a touch screen or in VoiceXML (at the
user's option) is cutting edge, IMHO.

By the way, we have the U.S. Federal community pointed at XForms-in-SVG for
the screen, as an alternative to XHTML.  You might think about that.

The point is that the touchable screen on a PDA/phone hybrid device acts like
the public kiosk in this regard.  Applications will be designed to be morphable
into forms compatible with these 3G devices.  Merging the design rules from
public kiosks and mobile personal devices will increase the market for
single-authoring that covers both.

See also

SVG and SMIL support in Access mobile phones
http://www.access.co.jp/english/press/030307.html

They have generated voice dialogs from generic controllable-appliance
specification documents at CMU.

More blather at

  http://lists.w3.org/Archives/Public/wai-xtech/2002Jul/0015.html


Al

At 04:03 AM 2003-04-09, Matthew Smith wrote:

>Jonathan, All
>
>>Does anyone know how touchscreens and interactive whiteboards activate 
>>onclick and onmouseover events?
>
>I have had some experience developing a kiosk application with Microtouch
>(now 3M Touch Systems) touch screens.  Since I knew HTML and didn't have
>the time to learn another UI toolkit, I used Mozilla with all the window
>decoration and controls turned off, running under a minimal XFree86
>installation.
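>
>For the record, the launch end of that can be as small as an .xinitrc
>along these lines (paths and URL illustrative); with no window manager
>running there is nothing else on screen:
>
>   #!/bin/sh
>   # Minimal X session: no window manager, just the browser.
>   # -chrome opens the URL without Mozilla's own toolbars.
>   exec mozilla -chrome http://localhost/kiosk/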
>
>A light touch activated onMouseOver events and a sharp tap acted as a
>click.  However, the system was anything but Accessible and has left me
>slightly wary of (touch screen + Web browser) in an Assistive
>Technologies context, at least with casual use.
>
>Although the tests were with a technically able-bodied sample, many people
>had a hard time using the system, at least until they had developed the
>"knack".  If I wired my events to onMouseOver, people would tap too hard
>and the event would be ignored; operating a standard link (no JavaScript
>handlers) soon caught out the people who didn't tap hard enough.
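>
>A more forgiving binding would accept either gesture for the same action,
>but only once per approach; something like this (names illustrative):
>
>   // Accept a light touch (mouseover) OR a sharp tap (click) for
>   // the same action, so nobody is punished for the "wrong" tap.
>   function bindForgiving(el, action) {
>     var armed = true;
>     function fire() {
>       if (armed) {
>         armed = false;
>         action(el);
>         setTimeout(function () { armed = true; }, 1000); // re-arm
>       }
>       return false;
>     }
>     el.onmouseover = fire;
>     el.onclick = fire;
>   }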
>
>Touch screens can also be problematic for anyone with motor/coordination 
>problems.
>
>My final solution for my kiosks (on which I am still working) is to keep
>the touchscreen (whichever event handler you choose), but to pass the data
>as XML, translating it to XHTML for the screen, with an option to translate
>to SABLE (a speech-synthesis XML) for voice output.  The alternate input
>device will be either the EZ interface or something similar, probably
>generating accesskey events.
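>
>The translation can live in one stylesheet with a mode switch; a cut-down
>sketch (the "prompt" element and "mode" parameter stand in for my real
>vocabulary):
>
>   <xsl:stylesheet version="1.0"
>       xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
>     <xsl:param name="mode" select="'screen'"/>
>     <!-- One source element, two renderings. -->
>     <xsl:template match="prompt">
>       <xsl:choose>
>         <xsl:when test="$mode = 'speech'">
>           <SABLE><xsl:value-of select="."/></SABLE>
>         </xsl:when>
>         <xsl:otherwise>
>           <p class="prompt"><xsl:value-of select="."/></p>
>         </xsl:otherwise>
>       </xsl:choose>
>     </xsl:template>
>   </xsl:stylesheet>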
>
>Hope this helps.
>
>Cheers
>
>M
>
>--
>Matthew Smith
>IT Consultant - KBC, South Australia
>KBC Web Site    http://www.kbc.net.au
>PGP Public Key  http://gpg.mss.cx
