Re: Rough draft of some success criteria for an extension guideline "Touch accessible"

Thanks for your input, Patrick. I've always very much appreciated your
comments on WCAG over the years, especially the last draft of WCAG 2...

I'll try to summarize your concern to ensure I understand it: you feel system
screen readers don't have the ability to map an alternative gesture in the
case where the author has created a custom gesture that collides with the
screen reader's core gestures. Given that the screen reader gesture will
override it, you think the furthest we can go to ensure such an application
is accessible is to require redundant functionality for custom gestures,
via buttons etc. Do I have that right?

If so, perhaps we can require that authors do not choose custom gestures
that collide with system screen reader gestures?
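For illustration, here's a rough sketch of the redundant-functionality pattern
being discussed: a custom swipe handler whose actions are also reachable
through ordinary buttons that react to click events (so users whose AT
intercepts touch gestures still have access). The element ids, handler names,
and threshold are hypothetical, not from any real implementation.

```javascript
// Pure helper: classify a swipe from start/end coordinates.
// Hypothetical threshold of 30px to ignore accidental taps.
function swipeDirection(startX, startY, endX, endY, threshold = 30) {
  const dx = endX - startX;
  const dy = endY - startY;
  if (Math.abs(dx) < threshold && Math.abs(dy) < threshold) return null;
  if (Math.abs(dx) >= Math.abs(dy)) return dx > 0 ? "right" : "left";
  return dy > 0 ? "down" : "up";
}

// Browser-only wiring: the gesture and the buttons trigger the SAME actions.
// When AT intercepts the touch events, the buttons remain operable.
function wireCarousel(el, actions) {
  let start = null;
  el.addEventListener("touchstart", (e) => {
    start = { x: e.touches[0].clientX, y: e.touches[0].clientY };
  });
  el.addEventListener("touchend", (e) => {
    if (!start) return;
    const t = e.changedTouches[0];
    const dir = swipeDirection(start.x, start.y, t.clientX, t.clientY);
    if (dir === "left") actions.next();
    if (dir === "right") actions.prev();
    start = null;
  });
  // Redundant, AT-reachable controls (hypothetical ids).
  document.getElementById("next").addEventListener("click", actions.next);
  document.getElementById("prev").addEventListener("click", actions.prev);
}
```

The point is only that the swipe is a convenience, never the sole path to the
functionality.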


Cheers,

David MacDonald



CanAdapt Solutions Inc.

Tel:  613.235.4902

LinkedIn <http://www.linkedin.com/in/davidmacdonald100>

www.Can-Adapt.com



Adapting the web to all users
Including those with disabilities

If you are not the intended recipient, please review our privacy policy
<http://www.davidmacd.com/disclaimer.html>

On Tue, Aug 18, 2015 at 5:00 AM, Patrick H. Lauke <redux@splintered.co.uk>
wrote:

> On 18/08/2015 02:25, David MacDonald wrote:
> > Furthermore, Detlev has a good start on the modified gesture issue
> >
> > 2.5.3 Modified Touch: When touch input behavior is modified by built-in
> > assistive technology, all functionality of the content is still operable
> > through touch gestures. (Level A)
>
> I'm still not sure if this is actually within the power of a web content
> developer to control.
>
> Let's take a concrete example: say I implement swipe gesture detection
> on a page. For instance, this rough-and-ready demo,
> http://patrickhlauke.github.io/touch/swipe/ - ignore the fact that it's
> currently completely inaccessible in many other ways; it's just something
> I made up ages ago to explore gesture detection principles. Let's assume
> the gesture detection is tied to a focusable control, and that swiping
> left, right, up, or down triggers some form of behavior.
>
> Once AT is running on the touchscreen device, swipes are intercepted by
> the AT. No touchstart/touchmove/touchend JavaScript events are fired when
> the user swipes, as those gestures are now handled by the AT.
>
> For me as the web content developer, there is no mechanism to say to the
> AT that I do want to handle gestures (whereas on desktop, for
> keyboard-based interactions, I can add role="application").
>
> For the user, on iOS/VO there's a pass-through gesture (double-tap and
> hold, perform the actual gesture that will be passed to the page). When
> this is used, the page's JavaScript does receive
> touchstart/touchmove/touchend events as normal. However, there is no
> announcement, hint, or anything else that would lead a user to even think
> there's a possible extra gesture in the page, and that they should use the
> pass-through. On Android/TalkBack, Windows Phone/Narrator, there is no - to
> my knowledge - pass-through gesture or equivalent, so both the developer
> and the user can't do anything with gestures at all, and they're all
> intercepted by the AT.
>
> So, as a developer, I have no way of actually meeting this proposed SC,
> unless there's some very clear clarification that custom gestures are
> somehow excluded, which I think then defeats the intended purpose of the SC.
>
> The only realistic option, from a developer's point of view, is to keep my
> gestures (for non-AT touchscreen users), but to complement them / provide
> equivalent functionality with more traditional means (controls, buttons,
> etc. that react to click events).
>
>
> P
> --
> Patrick H. Lauke
>
> www.splintered.co.uk | https://github.com/patrickhlauke
> http://flickr.com/photos/redux/ | http://redux.deviantart.com
> twitter: @patrick_h_lauke | skype: patrick_h_lauke
>
>
>

Received on Tuesday, 18 August 2015 13:51:32 UTC