Re: Rough draft of some success criteria for an extension guideline "Touch accessible"

I agree...

I think WCAG 2.1.1 already covers the need for keyboard use (without
MouseKeys)... this would be an add-on for the mobile space, where keyboard
use is quite uncommon. We could maybe plug the hole so that the author
doesn't rely on the pass-through gesture, the same way 2.1.1 doesn't let
authors rely on MouseKeys.

2.5.1 Touch: For pages and applications that support touch, all
functionality of the content is operable through touch gestures, with and
without system assistive technology activated, without relying on
pass-through gestures on the system. (Level A)
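
To illustrate the intent (just a rough sketch, not proposed SC text, in
TypeScript; the deleteItem function and the item id are made up for this
example): the author exposes each function through an ordinary actionable
control, and any custom gesture is only a shortcut on top of that.

    function deleteItem(id: string): void {
      document.getElementById(id)?.remove();
    }

    // Baseline: an ordinary focusable button. A VoiceOver/TalkBack user can
    // reach it by swiping through elements and activate it with a
    // double-tap, with no pass-through gesture involved.
    const del = document.createElement('button');
    del.textContent = 'Delete item 1';
    del.addEventListener('click', () => deleteItem('item-1'));
    document.body.appendChild(del);

    // Optional shortcut: the same action could also be bound to a custom
    // swipe gesture for touch users without AT running (see the touch-event
    // sketch after Patrick's message below). If that gesture never reaches
    // the page, the button still covers the functionality.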

Cheers,

David MacDonald



CanAdapt Solutions Inc.

Tel:  613.235.4902

LinkedIn <http://www.linkedin.com/in/davidmacdonald100>

www.Can-Adapt.com



Adapting the web to all users
Including those with disabilities

If you are not the intended recipient, please review our privacy policy
<http://www.davidmacd.com/disclaimer.html>

On Mon, Aug 17, 2015 at 7:48 AM, Patrick H. Lauke <redux@splintered.co.uk>
wrote:

> On 17/08/2015 12:27, David MacDonald wrote:
>
>> Patrick says: As it's not possible to recognise gestures when
>> VoiceOver is enabled, as VO intercepts gestures for its own purposes
>> (similar to how desktop AT intercept key presses) unless the user
>> explicitly uses a pass-through gesture, does this imply that interfaces
>> need to be made to also work just with an activation/double-tap ? i.e.,
>> does double-tap count in this context as a "gesture"? If not, it's not
>> technically possible for web pages to force pass-through (no equivalent
>> to role="application" for desktop/keyboard handling)...
>>
>> David: VO uses gestures for its own purposes and then adds gestures to
>> substitute for those it replaced i.e., VO 3 finger swipe = 1 finger
>> swipe. I'm suggesting that everything that can be accomplished with VO
>> off with gestures can be accomplished with VO on.
>>
>
> Not completely, though. If I build my own gesture recognition from basic
> principles (tracking the various touchstart/touchmove/touchend events), the
> only way that gesture can be passed on to the JS when VO is activated is if
> the user performs a pass-through gesture, followed by the actual gesture
> I'm detecting via JS. Technically, this means that yes, even VO users can
> make any arbitrary gesture detected via JS, but in practice, it's - in my
> mind - more akin to mouse-keys (in that yes, a keyboard user can nominally
> use any mouse-specific interface by using mouse keys on their keyboard,
> just as a touch-AT user can perform any custom gesture...but it's more of a
> last resort, rather than standard operation).
>
> Also, not sure if Android/TalkBack, Windows Mobile/Narrator have these
> sorts of pass-through gestures (even for iOS/VO, it's badly documented...no
> mention of it that I could find on any official Apple sites).
>
> In short, to me this still makes it lean more towards providing all
> functionality in other, more traditional ways (which would then also work
> for mobile/tablet users with an external keyboard/keyboard-like interface).
> Gestures can be like shortcuts for touch users, but should not replace more
> traditional buttons/widgets, IMHO. This may be a user setting perhaps?
> Choose if the interface should just rely on touch gestures, or provide
> additional focusable/actionable controls?
>
>
> P
> --
> Patrick H. Lauke
>
> www.splintered.co.uk | https://github.com/patrickhlauke
> http://flickr.com/photos/redux/ | http://redux.deviantart.com
> twitter: @patrick_h_lauke | skype: patrick_h_lauke
>
>
>
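
To make Patrick's point about raw touch events concrete, here is a minimal
sketch (TypeScript) of a custom "swipe left" recogniser built directly from
touchstart/touchend; the 50px threshold and the onSwipeLeft handler are
arbitrary choices for illustration. With VoiceOver running, these events
generally never reach the page unless the user first performs a
pass-through gesture, so the handler below would simply not fire for AT
users.

    let startX = 0;
    let startY = 0;

    // Record where the finger went down.
    document.addEventListener('touchstart', (e: TouchEvent) => {
      const t = e.changedTouches[0];
      startX = t.clientX;
      startY = t.clientY;
    });

    // On release, compare the end point with the start point.
    document.addEventListener('touchend', (e: TouchEvent) => {
      const t = e.changedTouches[0];
      const dx = t.clientX - startX;
      const dy = t.clientY - startY;
      // A mostly horizontal, right-to-left movement of 50px or more counts
      // as a swipe left.
      if (dx <= -50 && Math.abs(dy) < Math.abs(dx)) {
        onSwipeLeft();
      }
    });

    function onSwipeLeft(): void {
      console.log('custom swipe left detected');
    }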

Received on Monday, 17 August 2015 15:36:51 UTC