Focus events on touch devices

I'm currently working on accessibility hooks for a canvas-based web
application and running into some difficulty with Apple's iOS VoiceOver.
Android currently provides a shell and/or a joystick/trackball to move
the focus position, essentially using the tabindex routines we're all
comfortable with.

Apple made things a bit more useful by capturing touch events and
detecting whether there are landmarks such as anchors and buttons, then
setting a virtual focus on those areas.

Their technique works quite well, but it was not designed with [canvas]
in mind, as I believe it expects usemap.
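
For reference, the sort of structure I believe their heuristic handles
well is a plain image map; something like this (file name and
coordinates are arbitrary):

  <img src="board.png" usemap="#regions" alt="Drawing board">
  <map name="regions">
    <area shape="rect" coords="20,20,100,60" href="#save" alt="Save">
    <area shape="rect" coords="120,20,200,60" href="#load" alt="Load">
  </map>

VoiceOver can land its virtual focus on each <area>, which is exactly
the kind of landmark a [canvas] element doesn't expose.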

Regardless of that: is Apple's technique something that should be
discussed further? It's very much like a touch-hover event. A touchstart
event is not actually fired until the user hovers over an element and
then taps the screen.
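
A minimal way to observe that ordering is to log the events as they
arrive; the element id here is just a placeholder:

  var btn = document.getElementById('demo-button');
  ['focus', 'touchstart', 'click'].forEach(function (type) {
    btn.addEventListener(type, function (e) {
      // With VoiceOver running, touchstart should only show up
      // after the element has received the virtual focus.
      console.log(e.type, Date.now());
    });
  });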

I'm still experimenting and looking for advice. Currently, overlaying
<button> elements on the clickable areas of the canvas works, but it's a
kludge.
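
For the record, the kludge looks roughly like this; the region
coordinates and the activateRegion handler are placeholders for the
app's own hit-testing logic:

  <style>
    .wrap { position: relative; }
    .hotspot {
      position: absolute;
      /* Transparent, but (unlike display:none) should remain in
         the accessibility tree and be focusable by VoiceOver. */
      opacity: 0;
    }
  </style>

  <div class="wrap">
    <canvas id="board" width="400" height="300"></canvas>
    <button class="hotspot"
            style="left: 20px; top: 20px; width: 80px; height: 40px;"
            data-region="save">Save</button>
  </div>

  <script>
    var buttons = document.querySelectorAll('.hotspot');
    for (var i = 0; i < buttons.length; i++) {
      buttons[i].addEventListener('click', function (e) {
        // Stand-in for the app's canvas activation routine.
        activateRegion(e.currentTarget.dataset.region);
      });
    }
  </script>

The obvious downside is keeping the button geometry in sync with
whatever the canvas actually draws, which is why I call it a kludge.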


-Charles

Received on Monday, 7 March 2011 19:48:46 UTC