Focus events on touch devices

From: Charles Pritchard <chuck@jumis.com>
Date: Mon, 07 Mar 2011 11:48:15 -0800
Message-ID: <4D75367F.2010205@jumis.com>
To: "www-dom@w3.org" <www-dom@w3.org>
I'm currently working on accessibility hooks for a canvas-based web 
application and running into some difficulty with Apple's iOS VoiceOver. 
Android currently provides a shell, and/or a joystick/trackball to move 
the focus position, essentially using the tabindex routines we're all 
comfortable with.
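
For reference, the tabindex approach I mean is roughly this (the element
names and labels here are just illustrative, not from my actual app):

```html
<!-- Fallback DOM inside the canvas: screen readers and trackball/D-pad
     navigation can reach these via tabindex, even though sighted users
     see only the drawn canvas. -->
<canvas id="app" width="640" height="480">
  <button id="play" tabindex="1">Play</button>
  <button id="stop" tabindex="2">Stop</button>
</canvas>
```

On Android this works with the joystick/trackball focus model; the trouble
is that VoiceOver's touch exploration doesn't know where these controls
sit on screen.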

Apple made things a bit more useful by capturing touch events and 
detecting whether there are landmarks such as anchors and buttons, then 
setting a virtual focus on those areas.

Their technique works quite well, but it was not designed with [canvas] 
in mind, as I believe it expects usemap.

Regardless of that: is Apple's technique something that should be 
discussed further?
It's very much like a touch-hover event. A touchstart event is not 
actually fired until the user hovers over an element, then taps the 
screen.

I'm still experimenting and looking for advice. Currently, overlaying 
<button> elements on clickable areas of the canvas works, but it's a 
kludge.
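
A rough sketch of that kludge, in case it helps the discussion (the helper
and region shape are my own naming, not any standard API):

```javascript
// Compute the CSS needed to position a focusable <button> over one
// clickable region of the canvas. Coordinates are in CSS pixels,
// relative to a position:relative container wrapping the <canvas>.
function overlayStyle(region) {
  return {
    position: 'absolute',
    left: region.x + 'px',
    top: region.y + 'px',
    width: region.w + 'px',
    height: region.h + 'px',
    // Keep the button focusable and hit-testable, but invisible,
    // so the drawn canvas shows through.
    opacity: '0',
  };
}

// In the page, one button per region, e.g.:
//   var btn = document.createElement('button');
//   btn.textContent = region.label;            // what VoiceOver announces
//   Object.assign(btn.style, overlayStyle(region));
//   container.appendChild(btn);                // container wraps the canvas
```

VoiceOver's touch exploration then lands on the invisible buttons, since
they are real elements at real screen positions; the kludge is that every
canvas hit region has to be mirrored and kept in sync in the DOM.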

Received on Monday, 7 March 2011 19:48:46 UTC