RE: Rough draft of some success criteria for an extension guideline "Touch accessible"

[Gregg wrote]

> then the screen reader could use that method to achieve the function, rather than needing to know or be able to perform the new gesture

In practice, how would this work though? Mobile screen readers don’t provide ways to map keystrokes to gestures (the approach is not accessibility supported), and it is unreasonable to require screen reader users to carry around a keyboard to have access comparable to that of people who don’t use screen readers.

Jonathan

--
Jonathan Avila
Chief Accessibility Officer
SSB BART Group
jon.avila@ssbbartgroup.com
Phone 703.637.8957
Follow us: Facebook<http://www.facebook.com/#!/ssbbartgroup> | Twitter<http://twitter.com/#!/SSBBARTGroup> | LinkedIn<http://www.linkedin.com/company/355266?trk=tyah> | Blog<http://www.ssbbartgroup.com/blog> | Newsletter<http://eepurl.com/O5DP>

From: Gregg Vanderheiden [mailto:gregg@raisingthefloor.org]
Sent: Tuesday, August 18, 2015 11:13 AM
To: David MacDonald
Cc: Patrick H. Lauke; public-mobile-a11y-tf@w3.org
Subject: Re: Rough draft of some success criteria for an extension guideline "Touch accessible"

This should not be a problem if the WCAG “keyboard interface” provision is followed.

If all functionality can be achieved from the keyboard interface, then the screen reader could use that method to achieve the function, rather than needing to know or be able to perform the new gesture.
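For example (a rough sketch only; the element id and the goNext/goPrevious functions are invented for illustration), the function a swipe triggers would also be wired to the keyboard interface:

// Rough sketch: expose the swipe-triggered functions through the
// keyboard interface as well. Names here are illustrative only.
var widget = document.getElementById('carousel');

function goNext()     { /* same function a swipe left triggers */ }
function goPrevious() { /* same function a swipe right triggers */ }

widget.addEventListener('keydown', function (e) {
  if (e.key === 'ArrowRight') { goNext(); }
  if (e.key === 'ArrowLeft')  { goPrevious(); }
});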

No?

gregg

----------------------------------
Gregg Vanderheiden
gregg@raisingthefloor.org



On Aug 18, 2015, at 8:50 AM, David MacDonald <david100@sympatico.ca> wrote:

Thanks for your input, Patrick. I've always very much appreciated your comments to WCAG over the years, especially on the last draft of WCAG 2...
I'll try to summarize your concern to ensure I understand it: you feel system screen readers don't have the ability to map an alternative gesture in the case where the author has created a custom gesture that collides with the screen reader's core gesture set. Given that the screen reader gesture will override it, you think the furthest we can go toward ensuring such an application is accessible is to require functionality redundant with custom gestures in buttons, etc. Do I have that right?

If so, perhaps we can require that authors not choose custom gestures that collide with system screen reader gestures?

Cheers,
David MacDonald

CanAdapt Solutions Inc.
Tel:  613.235.4902
LinkedIn<http://www.linkedin.com/in/davidmacdonald100>
www.Can-Adapt.com

  Adapting the web to all users
            Including those with disabilities

If you are not the intended recipient, please review our privacy policy<http://www.davidmacd.com/disclaimer.html>

On Tue, Aug 18, 2015 at 5:00 AM, Patrick H. Lauke <redux@splintered.co.uk> wrote:
On 18/08/2015 02:25, David MacDonald wrote:
> Furthermore Detlev has a good start on the modified gesture issue
>
> 2.5.3 Modified Touch: When touch input behavior is modified by built-in
> assistive technology, all functionality of the content is still operable
> through touch gestures. (Level A)

I'm still not sure if this is actually within the power of a web content developer to control.

Let's take a concrete example: say I implement swipe gesture detection on a page. For instance, this rough-and-ready demo: http://patrickhlauke.github.io/touch/swipe/ (ignore the fact that it is currently completely inaccessible in many other ways; it's just something I made up ages ago to explore gesture detection principles). Let's assume the gesture detection is tied to a focusable control, and that swiping left, right, up, or down triggers some form of behavior.
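In skeleton form, such gesture detection looks roughly like this (a simplified sketch, not the actual code behind that page; the element id and the 30px threshold are arbitrary):

// Simplified swipe detection sketch (not the demo page's actual code).
var target = document.getElementById('swipe-area');
var startX = 0, startY = 0;

target.addEventListener('touchstart', function (e) {
  startX = e.changedTouches[0].clientX;
  startY = e.changedTouches[0].clientY;
});

target.addEventListener('touchend', function (e) {
  var dx = e.changedTouches[0].clientX - startX;
  var dy = e.changedTouches[0].clientY - startY;
  if (Math.abs(dx) > Math.abs(dy)) {
    if (dx >  30) { /* swipe right behavior */ }
    if (dx < -30) { /* swipe left behavior */ }
  } else {
    if (dy >  30) { /* swipe down behavior */ }
    if (dy < -30) { /* swipe up behavior */ }
  }
});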

Once AT is running on the touchscreen device, swipes are intercepted by the AT. No touchstart/touchmove/touchend JavaScript events are fired when the user swipes, as those gestures are now handled by the AT.

For me as the web content developer, there is no mechanism to tell the AT that I do want to handle gestures myself (whereas on desktop, for keyboard-based interactions, I can add role="application").
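For what it's worth, that desktop escape hatch is just standard ARIA, settable from markup or script; nothing comparable exists for touch gestures (the element id below is purely illustrative):

// Desktop analogy only: role="application" asks screen readers to pass
// raw keyboard events through to the page. There is no equivalent
// signal for touch gestures. The 'my-widget' id is illustrative.
document.getElementById('my-widget').setAttribute('role', 'application');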

For the user, on iOS/VO there's a pass-through gesture (double-tap and hold, then perform the actual gesture, which is passed to the page). When this is used, the page's JavaScript does receive touchstart/touchmove/touchend events as normal. However, there is no announcement, hint, or anything else that would lead a user to even suspect there's an extra gesture in the page and that they should use the pass-through. On Android/TalkBack and Windows Phone/Narrator there is, to my knowledge, no pass-through gesture or equivalent, so neither the developer nor the user can do anything with gestures at all; they're all intercepted by the AT.

So, as a developer, I have no way of actually meeting this proposed SC, unless there's some very clear clarification that custom gestures are somehow excluded, which I think then defeats the intended purpose of the SC.

The only realistic option, from a developer's point of view, is to keep my gestures (for non-AT touchscreen users) but to complement them with equivalent functionality through more traditional means (controls, buttons, etc. that react to click events).
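A minimal sketch of that approach, with invented ids and function names:

// Sketch of complementary controls: gestures stay for non-AT users,
// while ordinary buttons expose the same functions to AT users
// (a screen reader's double-tap fires a normal click event).
function goNext()     { /* whatever a swipe left triggers */ }
function goPrevious() { /* whatever a swipe right triggers */ }

document.getElementById('next-button')
        .addEventListener('click', goNext);
document.getElementById('previous-button')
        .addEventListener('click', goPrevious);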


P
--
Patrick H. Lauke

www.splintered.co.uk | https://github.com/patrickhlauke

http://flickr.com/photos/redux/ | http://redux.deviantart.com
twitter: @patrick_h_lauke | skype: patrick_h_lauke
