- From: Klaus Weidner <klausw@google.com>
- Date: Tue, 11 Feb 2020 10:18:45 -0800
- To: Matthew Tylee Atkinson <matkinson@paciellogroup.com>
- Cc: public-immersive-web-wg@w3.org, W3C WAI Accessible Platform Architectures <public-apa@w3.org>
- Message-ID: <CAFU2V81F7paNs5PnYzhDZHu5qJLtyPWU7zMo_kMmQQQDzxaMyA@mail.gmail.com>
My (very unofficial) understanding is that the WebXR Gamepads Module is specifically focused on providing additional information for existing XR controllers, exposing their buttons and axes beyond the XR events for primary select and squeeze. From that point of view, I don't think it would be within the scope of that module to support non-XR gamepads as XR controllers. A gamepad device would still be available through the non-XR Gamepad API, and as far as I know application frameworks such as A-Frame can easily support gamepads as inputs within XR sessions, but specific support would be up to individual applications.

Separately, I think there's precedent for simple clicker inputs on XR systems. For example, the Cardboard button generates XR "select" events in Chrome on Android, using the core WebXR API for input rather than the WebXR Gamepads Module. If I remember correctly, HoloLens v1 came with an untracked clicker button, but I don't know if this is currently supported for WebXR. These cases are generally limited to inputs considered part of the XR system.

Bigger picture, I think it would be worth pursuing accessible inputs at the user agent level. For example, a user agent (or extension, assuming appropriate APIs for this exist) could provide an option to expose a new WebXR input device based on a clicker input combined with gaze direction, using either the headset pose or realtime eye tracking if available at the platform level. When enabled, this could either be an additional XR input device or could replace the usual platform controllers. Doing this at the user agent level would have the potential to make applications accessible even if they don't themselves implement accessible input methods.
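To illustrate the application-level route mentioned above, here is a rough sketch of how a page could treat a non-XR gamepad's primary button as a select-style action inside an XR session. The helper is pure logic; the frame-loop wiring is shown only in comments, and names like `onSyntheticSelect` are placeholders rather than anything from the spec:

```javascript
// Pure helper: given a Gamepad-like object, report whether its primary
// button (index 0 in the standard gamepad mapping) is currently pressed.
function isPrimaryPressed(gamepad) {
  return !!(gamepad && gamepad.buttons && gamepad.buttons[0] &&
            gamepad.buttons[0].pressed);
}

// In a browser, an application could poll this from its XR frame loop:
//
// let wasPressed = false;
// function onXRFrame(time, frame) {
//   const pad = navigator.getGamepads().find(Boolean);
//   const pressed = isPrimaryPressed(pad);
//   if (pressed && !wasPressed) onSyntheticSelect(frame); // app-defined
//   wasPressed = pressed;
//   frame.session.requestAnimationFrame(onXRFrame);
// }
```

The point being that nothing stops an application from doing this today via the regular Gamepad API; it just isn't something the WebXR Gamepads Module provides for free.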
As a side benefit, this could also improve privacy. Taking the example of realtime eye tracking, I think it would be much less invasive if this data stayed within the user agent and were only exposed indirectly as an XR controller, as opposed to making eye tracking data directly available to sites through a web API.

I think UA-synthesized or modified XR inputs have many more possibilities, for example extending arms or raising the user's eye level by applying modifiers to raw input poses. This could work transparently across applications that use WebXR inputs.

On Tue, Feb 11, 2020 at 9:15 AM Matthew Tylee Atkinson <matkinson@paciellogroup.com> wrote:

> Hello Immersive Web WG,
>
> We in the Accessible Platform Architectures WG have been reviewing the
> WebXR Gamepads Module - Level 1 from an accessibility perspective, and
> there were a couple of things we'd like to clarify. We'd be grateful if you
> could provide any info on the following.
>
> 0. From the spec itself, and its description on GitHub, we gather that the
> purpose of this spec is to give WebXR developers a familiar API (the
> Gamepad API) to use to query the state of XR-specific controller/input
> devices. (I.e. the purpose is _not_ to support general gamepads _as_ XR
> controllers.) Assuming that we understood this correctly...
>
> 1. Have you considered supporting non-XR-specific controllers as XR input
> devices? This may include devices such as general gamepads and the Xbox
> Adaptive Controller. Issue 392 [0] discusses pros and cons of using the
> Gamepad API, and mentions accessibility. I was wondering if there's been
> any related discussion since the Gamepad API approach was adopted in PR 499?
>
> Thanks for any info you may be able to provide on the above.
> best regards,
>
> Matthew
>
> [0] https://github.com/immersive-web/webxr/issues/392
> --
> Matthew Tylee Atkinson
> --
> Senior Accessibility Engineer
> The Paciello Group
> https://www.paciellogroup.com
> A Vispero Company
> https://www.vispero.com/
> --
> This message is intended to be confidential and may be legally privileged.
> It is intended solely for the addressee. If you are not the intended
> recipient, please delete this message from your system and notify us
> immediately. Any disclosure, copying, distribution or action taken or
> omitted to be taken by an unintended recipient in reliance on this message
> is prohibited and may be unlawful.
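P.S. To make the pose-modifier idea above a bit more concrete, here is a rough sketch of the kind of transform a user agent could apply before exposing poses to the page. The object shapes loosely mirror XRRigidTransform, but nothing here is an actual user-agent API; all names are hypothetical:

```javascript
// Hypothetical UA-level modifier: raise the reported eye-level (viewer)
// position by a fixed offset in meters before the page sees it.
function raiseEyeLevel(pose, offsetMeters) {
  const { x, y, z } = pose.position;
  return { ...pose, position: { x, y: y + offsetMeters, z } };
}

// "Extending arms" could work similarly, scaling the controller position
// outward along the headset-to-controller direction:
function extendArm(headsetPos, controllerPos, scale) {
  return {
    x: headsetPos.x + (controllerPos.x - headsetPos.x) * scale,
    y: headsetPos.y + (controllerPos.y - headsetPos.y) * scale,
    z: headsetPos.z + (controllerPos.z - headsetPos.z) * scale,
  };
}
```

Since the modification happens before poses reach the page, applications that consume standard WebXR input poses would pick this up without any changes.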
Received on Tuesday, 11 February 2020 18:19:00 UTC