Re: WebXR Gamepads Module accessibility queries

Thanks Matthew, and also to Klaus for insights.

Matthew Tylee Atkinson wrote on 12/02/2020 19:20:
> Bigger picture, I think it would be worth pursuing accessible inputs 
> at the user agent level. For example, a user agent (or extension, 
> assuming appropriate APIs for this exist) could provide an option to 
> expose a new WebXR input device based on a clicker input combined with 
> gaze direction, using either the headset pose or realtime eye tracking 
> if available at the platform level. When enabled, this could either be 
> an additional XR input device, or could replace the usual platform 
> controllers. Doing this at the user agent level would have the 
> potential to make applications accessible even if they don't 
> themselves implement accessible input methods.

+1 This is a very interesting idea. It could also form a bedrock of 
support for legacy assistive technology, for example older switch 
devices used by people with limited physical mobility. These may not 
support any new APIs, but they map to keyboard inputs by default. You 
can see an example here in 'One thumb to rule them all':

https://www.youtube.com/watch?v=2BhHwk9qSvI

In this video a modified switch is used in conjunction with an OS-level 
scanning application. The user can play Quake, write his book, and 
more. This is one of my favourite Assistive Technology videos.
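The scanning model in that video is simple enough to sketch. As a rough illustration only (the `SwitchScanner` name and shape are hypothetical, not any real API): a timer cycles a highlight through the available targets, and the single switch, typically arriving as a keyboard event, activates whatever is currently highlighted.

```javascript
// Minimal sketch of single-switch scanning. A scan timer calls tick() to
// move the highlight; the switch press calls select() to activate the
// currently highlighted target. All names here are illustrative.
class SwitchScanner {
  constructor(targets) {
    this.targets = targets;
    this.index = 0; // currently highlighted target
  }
  // Advance the highlight to the next target (called on each scan tick).
  tick() {
    this.index = (this.index + 1) % this.targets.length;
    return this.targets[this.index];
  }
  // Activate the currently highlighted target (called on switch press,
  // e.g. a keydown handler, since legacy switches map to keyboard input).
  select() {
    return this.targets[this.index];
  }
}

const scanner = new SwitchScanner(["Play", "Jump", "Menu"]);
scanner.tick();                // highlight moves from "Play" to "Jump"
console.log(scanner.select()); // → "Jump"
```

Because the switch reaches the page as an ordinary keyboard event, an approach like this needs no new API at all, which is what makes it robust for older assistive technology.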

> As a side benefit, this could also improve privacy - for the example 
> of realtime eye tracking, I think it would be much less invasive if 
> this data stays within the user agent and is just exposed indirectly 
> as a XR controller, as opposed to making eye tracking data directly 
> available to sites through a web API.

+1
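For concreteness, the indirection Matthew describes might look roughly like this, with the caveat that all names are hypothetical and the real mediation would live inside the user agent, not in page script: raw eye-tracking samples stay private, and the site only ever sees a controller-like target ray.

```javascript
// Sketch of the indirection idea: raw eye-tracking samples stay inside
// the user agent (modelled here as a closure) and the page-visible API
// exposes only the latest gaze as a unit direction vector, as if it came
// from an ordinary XR controller. All names are illustrative.
function makeGazeController() {
  const rawSamples = []; // private: never exposed to the page

  return {
    // User-agent side: ingest a raw eye-tracking sample {x, y, z}.
    feed(sample) {
      rawSamples.push(sample);
    },
    // Page-visible side: just a normalized target-ray direction,
    // with no access to the underlying sample stream.
    targetRayDirection() {
      const s = rawSamples[rawSamples.length - 1];
      const len = Math.hypot(s.x, s.y, s.z);
      return { x: s.x / len, y: s.y / len, z: s.z / len };
    },
  };
}

const gaze = makeGazeController();
gaze.feed({ x: 0, y: 0, z: -2 });
console.log(gaze.targetRayDirection()); // → { x: 0, y: 0, z: -1 }
```

The design point is that the boundary sits in the user agent: sites consume the synthetic input through the existing controller model, so they gain nothing by asking for the raw gaze data.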

Thanks

Josh
-- 
Emerging Web Technology Specialist/Accessibility (WAI/W3C)

Received on Thursday, 13 February 2020 10:38:43 UTC