
Re: Finding examples which demonstrate "Concurrent Input Mechanisms"

From: Patrick H. Lauke <redux@splintered.co.uk>
Date: Wed, 14 Feb 2018 21:37:38 +0000
To: w3c-wai-gl@w3.org
Message-ID: <865ae4cf-d297-4991-fc83-6508faa6a250@splintered.co.uk>
On 14/02/2018 20:32, Chuck Adams wrote:
> I performed a variety of searches to find pages to test “Concurrent 
> Input Mechanisms”.  I first found pages that met the standard, but I 
> quickly found that the standard was met passively.

Indeed. At its simplest level, if a site relies only on high-level, 
input-agnostic events (focus, blur, click), it will work regardless of 
(standard) input mechanism, and it will neither distinguish nor care 
whether a user switches between them.
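As an illustration (a hypothetical sketch, not any real site's code): a 
single handler on a high-level event serves every mechanism, because 
browsers synthesize "click" from taps, mouse clicks, and keyboard 
activation alike. The makeElement() stand-in below is invented purely so 
the sketch runs outside a browser.

```javascript
// Minimal stand-in for a DOM element, so the sketch runs outside a browser.
function makeElement() {
  const handlers = {};
  return {
    addEventListener(type, fn) {
      (handlers[type] = handlers[type] || []).push(fn);
    },
    dispatch(type) {
      (handlers[type] || []).forEach(fn => fn());
    },
  };
}

const button = makeElement();
let activations = 0;

// One handler, no sniffing of the input mechanism.
button.addEventListener('click', () => { activations++; });

// Whether the user taps, clicks, or presses Enter, the browser
// ends up firing the same high-level "click" event.
button.dispatch('click'); // from a tap
button.dispatch('click'); // from a mouse click
button.dispatch('click'); // from pressing Enter

console.log(activations); // 3
```

Because no input mechanism is detected or special-cased, switching between 
them mid-session is a non-event for code written this way.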

Examples in the wild of large sites failing the concurrent input 
mechanisms clause are likely rare because, particularly with more users 
on hybrid touch-enabled laptops, failing sites are quickly identified as 
defective, have bugs reported against them, and get fixed.

One of the rarest finds I made a while ago (April 2016) was the 
flickr.com site, which, on a device reporting the presence/capability of 
a touchscreen, would simply ignore mouse functionality when zooming into 
a photograph. I documented this here 
https://www.youtube.com/watch?v=f_2GKsI9TQU (with apologies for the 
out-of-sync audio) and reported it to a contact at Flickr... and the bug 
was fixed on their live site within two weeks or so.


> I then tested the functionality of the page using Talkback.  My android 
> phone was my testing platform, and in the same session I enabled 
> Talkback and proceeded to use Talkback feedback and gestures to exercise 
> the functionality of the page.

Arguably, the use of assistive technologies is not really necessary for 
this kind of testing, and may in fact lead to unintended results: AT 
often swallows/remaps inputs for its own controls. For example, touch 
gestures are no longer passed to the page itself but are consumed by 
TalkBack/VoiceOver for their own navigation, unless the user performs an 
explicit pass-through gesture to send actual touches/swipes/etc. to the 
page/application itself.

> I then randomly switched input methods during my test.  I did find that 
> the page behaved as expected in any of the input methods described 
> above.  I did, however, find that there were some minor issues if 
> multiple input methods were mixed (for example, using Talkback 
> simultaneously with mouse and keyboard).  I encountered these issues 
> accidentally, and I believe that such a test is really outside the scope 
> of the standard.  I also believe that my issues were more related to the 
> platform and not the web page.


> I then switched platforms to my Surface Pro 4 tablet and my Toshiba X1 
> Tablet, and repeated my tests (without Talkback).

This would be my suggested test platform/setup. If a site does a naive 
"if a touchscreen is present, only register touch events; otherwise, 
listen for regular mouse or click events", the failure will be obvious 
on a touch-enabled laptop in most cases.
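A sketch of that naive anti-pattern (hypothetical code, not any real 
site's implementation; the makeElement() stand-in is invented so the 
sketch runs outside a browser):

```javascript
// Minimal stand-in for a DOM element, so the sketch runs outside a browser.
function makeElement() {
  const handlers = {};
  return {
    addEventListener(type, fn) {
      (handlers[type] = handlers[type] || []).push(fn);
    },
    dispatch(type) {
      (handlers[type] || []).forEach(fn => fn());
    },
  };
}

// Anti-pattern: pick ONE input path at load time based on capability sniffing.
function naiveSetup(el, hasTouchscreen) {
  if (hasTouchscreen) {
    el.addEventListener('touchend', () => { el.activated = true; });
  } else {
    el.addEventListener('mouseup', () => { el.activated = true; });
  }
}

const photo = makeElement();
photo.activated = false;

naiveSetup(photo, true);   // hybrid laptop: a touchscreen is reported present
photo.dispatch('mouseup'); // ...but the user reaches for the mouse

console.log(photo.activated); // false: mouse input is silently ignored
```

On a touch-enabled laptop the capability check is true, so only the touch 
handler is ever registered and every mouse interaction falls on the floor, 
which is exactly the Flickr-style failure described above.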


> Mapquest:  Interaction with the web based application seems to fit quite 
> nicely in the standard, and in brief tests I was able to switch between 
> touch and then keyboard/mouse.  The issue I had with Mapquest is that 
> keyboard only usage was challenging, as there are many instances where I 
> was unable to determine which object had focus.  As such I concluded 
> that Mapquest was not a good site to demonstrate this standard, because 
> of concerns about how well other standards are met.
> Codepen:  This is a publicly available tool for testing code.  The issue 
> I had was that I was unable to determine what keyboard shortcuts and 
> combinations were necessary to move around the components and regions in 
> the tool.  Codepen may be an adequate example page, but either I need to 
> learn how to use it effectively with keyboard only or it is too 
> dependent on the mouse.
> Google Earth:  This site was promising, and may require more 
> investigation.  There are a lot of interactions that I was intuitively 
> able to figure out how to exercise with just keyboard, but there were 
> some functions I could not figure out.  Enough of it worked for me to 
> continue exploring it, but there were enough issues to inspire me to 
> look elsewhere for more promising sites (this was the first site I 
> evaluated).  As this one held the most promise, I’ll return to it to 
> continue testing.

I'd be careful about the overlap here with WCAG 2.0's 2.1.1 Keyboard. The 
aim of concurrent input mechanisms is not to test whether or not the 
site works with a keyboard, but rather to check that it is possible for 
a user to, essentially, switch between the inputs they have available. 
I.e. if a user started off interacting with the mouse, and the site then 
STOPPED allowing any sort of keyboard interaction, that would be a 
failure of this SC; likewise if a site detects that a touchscreen is 
present and STOPS working for mouse or keyboard users.
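To make the distinction concrete, here is a hypothetical sketch of the 
behaviour the SC asks for: handlers for every mechanism stay registered 
for the whole session, so the user can switch back and forth at will 
(the makeElement() stand-in is invented so the sketch runs outside a 
browser):

```javascript
// Minimal stand-in for a DOM element, so the sketch runs outside a browser.
function makeElement() {
  const handlers = {};
  return {
    addEventListener(type, fn) {
      (handlers[type] = handlers[type] || []).push(fn);
    },
    dispatch(type) {
      (handlers[type] || []).forEach(fn => fn());
    },
  };
}

const widget = makeElement();
const log = [];

// Register every mechanism up front; never tear one down
// just because another one was used more recently.
for (const type of ['touchend', 'mouseup', 'keyup']) {
  widget.addEventListener(type, () => log.push(type));
}

widget.dispatch('mouseup');  // session starts with the mouse
widget.dispatch('touchend'); // user switches to touch
widget.dispatch('mouseup');  // ...and back again: still handled

console.log(log.join(',')); // mouseup,touchend,mouseup
```

In real browser code, Pointer Events give you much of this for free, since 
a single pointerup handler fires for mouse, touch, and pen alike.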

Patrick H. Lauke

www.splintered.co.uk | https://github.com/patrickhlauke
http://flickr.com/photos/redux/ | http://redux.deviantart.com
twitter: @patrick_h_lauke | skype: patrick_h_lauke
Received on Wednesday, 14 February 2018 21:38:44 UTC
