- From: Joshue O Connor <joconnor@w3.org>
- Date: Mon, 16 Nov 2020 16:13:29 +0000
- To: "public-apa@w3.org" <public-apa@w3.org>
- Message-ID: <10790223-0b0e-292d-d77d-a19fd3660f3e@w3.org>
Hi all,

The following are notes from me on the issue tracker items I had an action to review for APA. [1]

Spec language precludes non-visual uses #815
https://github.com/immersive-web/webxr/issues/815

This thread is really very interesting. They are basically discussing multi-modal access, and theorising how this may be supported. The WebXR definitions - as found at the time of writing of the issue - focus very much on presenting 'imagery to the user', which doesn't accommodate the range of AT or other devices that may be used to access XR. The BrailleSense Polaris is given as an example by the OP. He also makes the point that there is no mention in the WebXR charter that this work should, or does, need to focus solely on 'visual imagery'.

https://www.w3.org/2018/08/immersive-web-wg-charter.html

Interestingly, the OP (@frastlin) then goes on to suggest that the scope of the charter is expanded so that:

* Language in the general spec is switched from visual to a-modal.
* Examples are given of XR experiences in modalities other than just visual.

There were then a bunch of PRs, but it's hard to see what came out of the thread, or its impact. Worth discussing and keeping an eye on. These come from @frastlin, the OP:

> I [submitted a PR](https://github.com/immersive-web/webxr/pull/925) for many of the changes we talked about for explainer.md.
> I [did another pr](https://github.com/immersive-web/webxr/pull/927) for the spec itself to change the definition of XR device.
> I [submitted a PR with an example to connect WebXR and Web Audio](https://github.com/immersive-web/webxr/pull/930)
> I [Opened up an issue to remove WebGL.](https://github.com/immersive-web/webxr/issues/926)

I'm not clear if these were accepted or resulted in changes to the charter - I've asked for an update and some clarification, so I'll feed back to APA.

Evaluate how/if WebXR should interact with audio-only devices #892
https://github.com/immersive-web/webxr/issues/892

This thread has a really good list of audio-only head-tracking devices. The relevant comment has been added to a Future milestone in WebXR.

Add Mode that Does Not Require WebGL #926
https://github.com/immersive-web/webxr/issues/926

Another interesting thread. The OP is suggesting a 'third mode' that may not require WebGL. Worth tracking, but I have some questions about how content drawn with WebGL would be synchronised with other modalities. To me, the XR sessions need to be in sync at all times if non-WebGL sessions are initiated, and I'm not sure how that would work.

Use native keyboard / Augmented Reality #971
https://github.com/immersive-web/webxr/issues/971

This thread has little traction, but asks the question of keyboard accessibility in AR sessions. Potentially important, but it didn't trigger a response from the WebXR team. I commented on this myself and have asked a follow-on question.

Content in immersive session should need not be search around #992
https://github.com/immersive-web/webxr/issues/992

This issue seems generic to me and not specific to a11y. It is about focus on the local space origin, but there is a comment relating to customisation of orientation and other a11y aspects (that I also responded to).

OK, that's it - I look forward to discussing.

Josh

[1] https://w3c.github.io/horizontal-issue-tracker/?repo=w3c/a11y-review

--
Emerging Web Technology Specialist/Accessibility (WAI/W3C)
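P.S. For anyone curious about the WebXR / Web Audio connection mentioned above (PR 930): a rough sketch of how a viewer pose could drive a spatial-audio listener is below. To be clear, this is my own illustrative sketch, not the code from that PR - the pose shape and the `poseToListenerParams` helper are my own assumptions, and the per-frame wiring to `XRFrame.getViewerPose()` and the Web Audio `AudioListener` is indicated only in comments.

```javascript
// Rotate a vector by a unit quaternion {x, y, z, w}, using the
// expansion v' = v + w*t + (q_xyz × t), where t = 2 * (q_xyz × v).
function rotateByQuaternion(v, q) {
  const tx = 2 * (q.y * v.z - q.z * v.y);
  const ty = 2 * (q.z * v.x - q.x * v.z);
  const tz = 2 * (q.x * v.y - q.y * v.x);
  return {
    x: v.x + q.w * tx + (q.y * tz - q.z * ty),
    y: v.y + q.w * ty + (q.z * tx - q.x * tz),
    z: v.z + q.w * tz + (q.x * ty - q.y * tx),
  };
}

// Map a pose (position + orientation quaternion, shaped like an
// XRRigidTransform) to the position/forward/up triple that a Web Audio
// AudioListener expects. Both APIs use the convention that -Z is
// "forward" and +Y is "up" in the local frame.
function poseToListenerParams(pose) {
  return {
    position: pose.position,
    forward: rotateByQuaternion({ x: 0, y: 0, z: -1 }, pose.orientation),
    up: rotateByQuaternion({ x: 0, y: 1, z: 0 }, pose.orientation),
  };
}

// Hypothetical per-frame wiring inside an XRSession animation frame:
//
//   const pose = xrFrame.getViewerPose(refSpace);
//   const p = poseToListenerParams(pose.transform);
//   const l = audioCtx.listener;
//   l.positionX.value = p.position.x;  // similarly positionY/positionZ
//   l.forwardX.value = p.forward.x;    // similarly forwardY/forwardZ
//   l.upX.value = p.up.x;              // similarly upY/upZ
```

Whether something along these lines could form the basis of the non-WebGL 'third mode' discussed in #926 presumably hinges on the synchronisation question I raised above.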
Received on Monday, 16 November 2020 16:13:36 UTC