RE: Guideline 2.6 text



> -----Original Message-----
> From: Detlev Fischer (TK) [mailto:detlev.fischer@testkreis.de]
> Sent: Wednesday, January 10, 2018 2:53 AM
>
> That is the way device sensors *will* be used, and there is nothing wrong with
> that - as long as authors provide alternatives. That is why I have suggested (a
> while ago) the following text:
>
> "Provide alternatives for user input via device sensors."
[Jason] However, 2.1.1 already requires such alternatives - "all functionality of the content" could hardly be clearer in its generality.

The proposal currently in the draft calls for UI components corresponding to functionality operated by device or user motion, which isn't quite the same as requiring operability via a keyboard or keyboard interface. It doesn't require the UI components to be operable via a pointing device or touch input (one could create a UI component with only a keyboard interface and nevertheless conform), but supporting multiple input mechanisms is what I would expect authors to do, even though the proposal doesn't strictly require it. Also, the UI component would be "visible" and available to assistive technologies - not just a keyboard shortcut hidden away somewhere.
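
To make that concrete, here is a rough sketch of the pattern I have in mind - nothing from the draft itself, and the shake-detection threshold and the undo() handler are invented purely for illustration. The point is simply that the same functionality is exposed both through the motion sensor and through a visible button that keyboards and assistive technologies can reach:

    // Illustrative sketch only: a visible, keyboard-operable control that
    // exposes the same functionality as a "shake to undo" motion gesture.
    // The SHAKE_THRESHOLD value and the undo() body are made up for the example.

    function undo(): void {
      // application-specific undo logic would go here
    }

    // Visible UI component: reachable by keyboard and exposed to assistive
    // technologies as an ordinary button.
    const undoButton = document.createElement('button');
    undoButton.textContent = 'Undo';
    undoButton.addEventListener('click', undo);
    document.body.appendChild(undoButton);

    // Motion-based path: the same functionality, triggered by shaking the device.
    const SHAKE_THRESHOLD = 15; // m/s^2, illustrative value
    window.addEventListener('devicemotion', (event: DeviceMotionEvent) => {
      const a = event.acceleration;
      if (a && Math.max(Math.abs(a.x ?? 0), Math.abs(a.y ?? 0), Math.abs(a.z ?? 0)) > SHAKE_THRESHOLD) {
        undo();
      }
    });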

Thus, it seems there is some value to be gained from the proposal as currently formulated.



Received on Wednesday, 10 January 2018 14:07:46 UTC