RE: Rough draft of some success criteria for an extension guideline "Touch accessible"

> If so, perhaps we can require that authors do not choose custom gestures that collide with system screen reader gestures???

IMO Patrick is not specifically talking about conflicting gestures, but about gestures other than the default tap/click gesture.  The problem we are running into is that there is no API to define actions on a web page that can be used in a device-independent fashion.  Take the example of an ARIA slider on a web page.  There is no way to create one action for decrease and one for increase that can be triggered by VoiceOver, for example, via the actions rotor when the user swipes up or down, as there is in native apps.  ARIA currently addresses this by instructing authors to listen for the up and down arrow keys – but when the VoiceOver actions rotor is enabled, swiping up and down does not send up and down keystrokes.  Although this might be a tempting solution, it would require mobile platform vendors to agree on a set of actions to associate with keystrokes, or to allow the user to customize the keys sent through an actions rotor.
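To illustrate the gap: the ARIA-recommended pattern boils down to a keystroke-to-value mapping like the hypothetical sketch below (sliderStep is an illustrative helper, not an existing API). A VoiceOver actions-rotor swipe up/down never reaches this code, because no arrow keystroke is sent.

```javascript
// Hypothetical sketch: map the arrow keys ARIA tells authors to listen for
// onto value changes for a custom slider, clamped to the slider's range.
// A VoiceOver actions-rotor swipe never sends these keystrokes, which is
// the gap described above.
function sliderStep(value, key, { min = 0, max = 100, step = 1 } = {}) {
  let next = value;
  if (key === 'ArrowUp' || key === 'ArrowRight') next = value + step;
  else if (key === 'ArrowDown' || key === 'ArrowLeft') next = value - step;
  return Math.min(max, Math.max(min, next)); // clamp to [min, max]
}

// In a page this would be wired to a keydown listener on the slider element:
// slider.addEventListener('keydown', e => {
//   const next = sliderStep(current, e.key);
//   slider.setAttribute('aria-valuenow', String(next));
// });
```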

Jonathan

--
Jonathan Avila
Chief Accessibility Officer
SSB BART Group
jon.avila@ssbbartgroup.com<mailto:jon.avila@ssbbartgroup.com>
Phone 703.637.8957
Follow us: Facebook<http://www.facebook.com/#!/ssbbartgroup> | Twitter<http://twitter.com/#!/SSBBARTGroup> | LinkedIn<http://www.linkedin.com/company/355266?trk=tyah> | Blog<http://www.ssbbartgroup.com/blog> | Newsletter<http://eepurl.com/O5DP>

From: David MacDonald [mailto:david100@sympatico.ca]
Sent: Tuesday, August 18, 2015 9:51 AM
To: Patrick H. Lauke
Cc: public-mobile-a11y-tf@w3.org
Subject: Re: Rough draft of some success criteria for an extension guideline "Touch accessible"

Thanks for your input, Patrick. I've always very much appreciated your comments to WCAG over the years, especially on the last draft of WCAG 2...
I'll try to summarize your concern to ensure I understand it: you feel system screen readers don't have the ability to map an alternative gesture in the case where the author has created a custom gesture that collides with the screen reader's core gesture functioning. Given that the screen reader gesture will override it, you think the furthest we can go to ensure such an application is accessible is to require redundant functionality for custom gestures in buttons etc... Do I have that right?

If so, perhaps we can require that authors do not choose custom gestures that collide with system screen reader gestures???


Cheers,

David MacDonald



CanAdapt Solutions Inc.

Tel:  613.235.4902

LinkedIn<http://www.linkedin.com/in/davidmacdonald100>

www.Can-Adapt.com<http://www.Can-Adapt.com>



  Adapting the web to all users
            Including those with disabilities

If you are not the intended recipient, please review our privacy policy<http://www.davidmacd.com/disclaimer.html>

On Tue, Aug 18, 2015 at 5:00 AM, Patrick H. Lauke <redux@splintered.co.uk<mailto:redux@splintered.co.uk>> wrote:
On 18/08/2015 02:25, David MacDonald wrote:
> Furthermore Detlev has a good start on the modified gesture issue
>
> 2.5.3 Modified Touch: When touch input behavior is modified by built-in
> assistive technology, all functionality of the content is still operable
> through touch gestures. (Level A)

I'm still not sure if this is actually within the power of a web content developer to control.

Let's take a concrete example: say I implement swipe gesture detection on a page. For instance, this rough-and-ready demo http://patrickhlauke.github.io/touch/swipe/ - ignore the fact that it's currently completely inaccessible in many other ways; it's just something I made up ages ago to explore gesture detection principles. Let's assume the gesture detection is tied to a focusable control, and that swiping left, right, up, or down triggers some form of behavior.
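This sort of detection can be sketched roughly as follows (a hypothetical classifier, not the actual code behind the demo URL above): record where the touch started, then classify the gesture from the coordinate deltas on touchend.

```javascript
// Hypothetical sketch of swipe detection: classify a gesture from its start
// and end coordinates. Movements shorter than the threshold are ignored.
function classifySwipe(startX, startY, endX, endY, threshold = 30) {
  const dx = endX - startX;
  const dy = endY - startY;
  if (Math.abs(dx) < threshold && Math.abs(dy) < threshold) return null; // too short
  return Math.abs(dx) > Math.abs(dy)
    ? (dx > 0 ? 'right' : 'left')   // predominantly horizontal
    : (dy > 0 ? 'down' : 'up');     // predominantly vertical
}

// In a page: record coordinates in a touchstart listener, then call
// classifySwipe from touchend. Once a screen reader intercepts swipes,
// those events never fire and this code is never reached.
```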

Once AT is running on the touchscreen device, swipes are intercepted by the AT. No touchstart/touchmove/touchend JavaScript events are fired when the user swipes, as those gestures are now handled by the AT.

For me as the web content developer, there is no mechanism to say to the AT that I do want to handle gestures (whereas on desktop, for keyboard-based interactions, I can add role="application").

For the user, on iOS/VO there's a pass-through gesture (double-tap and hold, then perform the actual gesture that will be passed to the page). When this is used, the page's JavaScript does receive touchstart/touchmove/touchend events as normal. However, there is no announcement, hint, or anything else that would lead a user to even think there's a possible extra gesture in the page, and that they should use the pass-through. On Android/TalkBack and Windows Phone/Narrator, there is - to my knowledge - no pass-through gesture or equivalent, so neither the developer nor the user can do anything with gestures at all; they're all intercepted by the AT.

So, as a developer, I have no way of actually meeting this proposed SC, unless there's some very clear clarification that custom gestures are somehow excluded - which, I think, then defeats the intended purpose of the SC.

The only realistic option, from a developer's point of view, is to keep my gestures (for non-AT touchscreen users), but to complement them with equivalent functionality provided through more traditional means (controls, buttons, etc. that react to click events).
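That "redundant functionality" approach can be sketched like this (an illustrative pattern, with hypothetical action names): route both the custom gesture and an ordinary button through one shared action function, so AT users who never reach the gesture handler still have a clickable control.

```javascript
// Hypothetical sketch: both input paths (custom swipe, plain button) call
// the same action, so functionality survives when the AT swallows gestures.
const actions = {
  next: state => ({ ...state, index: state.index + 1 }),
  prev: state => ({ ...state, index: Math.max(0, state.index - 1) }),
};

function perform(state, name) {
  // Unknown action names leave the state untouched.
  return actions[name] ? actions[name](state) : state;
}

// In a page, the two paths converge on perform():
// - gesture handler:  onSwipe(dir => perform(state, dir === 'left' ? 'next' : 'prev'))
// - redundant button: nextBtn.addEventListener('click', () => perform(state, 'next'))
//   (click fires via the AT's standard double-tap, so this path always works)
```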


P
--
Patrick H. Lauke

www.splintered.co.uk<http://www.splintered.co.uk> | https://github.com/patrickhlauke

http://flickr.com/photos/redux/ | http://redux.deviantart.com

twitter: @patrick_h_lauke | skype: patrick_h_lauke

Received on Tuesday, 18 August 2015 14:33:39 UTC