- From: David MacDonald <david100@sympatico.ca>
- Date: Thu, 31 Mar 2016 19:31:32 -0400
- To: "Patrick H. Lauke" <redux@splintered.co.uk>, Henny Swan <hswan@paciellogroup.com>, "public-mobile-a11y-tf@w3.org" <public-mobile-a11y-tf@w3.org>
- Message-ID: <BLU436-SMTP17999FD54637406F0F4CEE2FE990@phx.gbl>
Here is where this SC came from: http://www.bbc.co.uk/guidelines/futuremedia/accessibility/mobile/focus/touch-events

"Simple touch events must only be triggered when touch is removed from a control and not when first touched. This allows users to change their mind without being forced to commit to an action. Similarly, it allows users with disabilities to move their finger or stylus over items to locate the precise location without affecting action until the finger or stylus is removed."

=== Regarding Type 1 vs. Type 2 in Chris's email ===

There has been some discussion about this. Gregg was the first to point out that a hand tremor after contact could cause the user to lose focus on a viable target. It is an important consideration, and I think Jon is right in wanting to get more real data. Here's what we know so far: the researchers who presented at CSUN, whom Jon spoke with, tested a number of users with hand tremors and other physical disabilities who specifically fit the profile we are trying to address with this SC. They did not identify any issues with either Type 1 or Type 2, but said that theoretically they agreed with our current SC.

I think both Type 1 and Type 2 could be the same person, touching the screen at two different times. A person with CP, or a geriatric person with a tremor, could miss the target on touch if they lose control while approaching the screen, but the next time they could lose control AFTER touching the screen. However, I propose that it might be easier to find your target once you are touching the screen, using it as a support, than to accurately hit the target while moving the hand through mid-air. So I think the Type 2 problems are much more likely, and that may be why the BBC decided on that approach.

I think it's a very open Success Criterion that gives developers lots of flexibility. Regarding leaving it to UAAG: that's an entirely different beast, and perhaps a different discussion. I'd say let the SC stand, let the public kick at it, and see if it falls over.
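The BBC pattern above can be sketched, independent of any framework, as a control that arms on touch contact and fires its action only when the touch is lifted while still on the control. This is a minimal illustration; the function and method names are my own, not from any BBC code, and the methods stand in for the corresponding DOM touch events:

```javascript
// Sketch of the BBC guideline quoted above: arm on contact, cancel if
// the finger/stylus slides off, and activate only when the touch is
// lifted while still on the control. Method names model the DOM touch
// events but are illustrative only.
function createTouchControl(onActivate) {
  let armed = false;
  return {
    // Contact: nothing fires yet, so the user can still change their mind.
    touchstart() { armed = true; },
    // Sliding off the control cancels the pending activation.
    slideOff() { armed = false; },
    // The action fires only if the touch is lifted while still on target.
    touchend() {
      if (armed) {
        armed = false;
        onActivate();
      }
    },
  };
}
```

Sliding off before lifting the finger cancels the action, which is what lets a user "change their mind" in the BBC's wording.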
We have the BBC as a precedent, so it won't look like it's out of left field. Perhaps we should discuss this with Henny Swan.

On Thu, Mar 31, 2016 at 6:17 PM, Patrick H. Lauke <redux@splintered.co.uk> wrote:

> On 31/03/2016 21:13, Chris McMeeking wrote:
>
>> I would propose making this a Triple or Double A requirement, and having
>> the success criterion read very similar to what it does now. However,
>> say something about: users should have the ability to define whether
>> selection (onClick events in web speak) occurs on touch down, or on
>> touch up. This would satisfy both users. It certainly satisfies Type 1
>> users better than the current requirement.
>
> Should that not be something left up to the user agent (so UAAG)?
> Otherwise you're essentially requiring every site/app to provide some
> settings dialog/screen. Activation should happen on click (to satisfy
> situations where a user is not using a touchscreen, but a keyboard or
> keyboard-like interface which may not necessarily fire fake touch events -
> see some of the cases in
> http://patrickhlauke.github.io/touch/tests/results/#mobile-tablet-touchscreen-assistive-technology-events
> and
> http://patrickhlauke.github.io/touch/tests/results/#mobile-tablet-keyboard-mouse-events)
> as a general approach.
>
> P
> --
> Patrick H. Lauke
>
> www.splintered.co.uk | https://github.com/patrickhlauke
> http://flickr.com/photos/redux/ | http://redux.deviantart.com
> twitter: @patrick_h_lauke | skype: patrick_h_lauke
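Patrick's point about click-based activation can be sketched as follows. Binding the action to the "click" event (rather than "touchstart") lets the browser decide when activation happens: browsers synthesize a click for a mouse release, for keyboard activation of a button, and for a touch lifted on the target. The helper name is hypothetical, and a stand-in element object is used since no real DOM is assumed:

```javascript
// Sketch of click-based activation: a single "click" listener covers
// mouse, keyboard, and touch-tap input, because the browser fires
// "click" for all three. Listening to "touchstart" instead would both
// activate too early and miss keyboard-like interfaces entirely.
function bindActivation(element, onActivate) {
  // The browser decides when "click" fires (mouse up, key activation,
  // or a touch lifted on the target).
  element.addEventListener("click", onActivate);
}
```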
Received on Thursday, 31 March 2016 23:32:04 UTC