Re: Accessibility of XR - Notes on the Framework for Accessible Specification of Technologies

Hi, Jason, All:

Perhaps I suffer from a failure of imagination, but I don't think the
following is correct. Since this assumption is discussed in the
literature, we should probably test its validity. I quote from Jason's
email:

"A challenge for screen reader users, identified in the literature, of
walking to a particular object in the virtual environment, or of
following another participant, presumably lies under 4.2.1 ("Usage
without vision"). The solution is to provide commands for performing
these functions."

I would disagree that walking or following someone should be
command-driven. If it is, it's a poor piece of virtual reality, not
particularly consonant with the reality I experience on a daily basis.

If I'm to walk, I expect to walk in the virtual world just as I walk in
my everyday world. If I'm to follow someone else, I similarly expect to
do so using my established techniques.

If commands are actually required, then it's not my virtual reality
that I'm interacting with but a faux avatar, supposedly representing
me, that I'm directing through some designed environment.

I'm sorry if this sounds harsh, but I think it's important that we
express our meaning carefully on these topics.

Best,

Janina

White, Jason J writes:
> Background
> I reviewed the Framework for Accessible Specification of Technologies (FAST)<https://w3c.github.io/apa/fast/> in connection with the accessibility of applications that are on the reality/virtuality continuum. These include virtual reality and augmented reality scenarios. Any technologies developed to standardize the creation of XR applications are clearly candidates for use of the strategies documented in the FAST. The user needs identified therein are also strongly relevant.
> Observations
> Many of the user needs identified in the FAST are pertinent to XR technologies, as are the approaches put forward for addressing them. Nevertheless, there are some issues emerging from the XR-related literature that we have reviewed which are not currently represented in the user needs or associated technical design strategies, according to my reading of the FAST document.
> Significantly, however, these issues can be located (sometimes, admittedly, by stretching the principles) within the conceptual framework provided by functional performance requirements as articulated in accessibility policy standards, notably the Section 508/Section 255 standards in the U.S. and EN 301 549 in the E.U. See, for example, EN 301 549, clause 4<http://mandate376.standards.eu/standard/functional-statements>.
> It should be noted that the functional performance requirements need to be applied not only individually but also in combination in order to capture the diversity of users' needs and capabilities.
> Examples of XR Accessibility Issues Addressed by Functional Performance Requirements
> It was noted in APA Working Group discussions that some XR environments make assumptions about the user's ability to reach the controls provided by the user interface, including the ability to turn around in order to reach certain controls. Clause 4.2.8 ("Usage with limited reach") addresses this concern, although it is stated to apply to cases in which "ICT products are free-standing or installed" - not precisely the XR scenarios that we have in mind, but very close, and articulating the correct requirement.
> The specific issue of captions in 360-degree video resides under clauses 4.2.4 and 4.2.5 (note the cross-reference in the latter section).
> The problem, identified in the literature, of providing nonvisual descriptions of a virtual scene without overwhelming the user with information concerning all of the virtual objects in the vicinity, is not so easily classified. It appears to me to derive from elements of 4.2.1 ("Usage without vision") and 4.2.10 ("Usage with limited cognition"). The cognitive limitations are not in this case necessarily due to learning or cognitive disability, but rather to the cognitive demands of dealing with extensive information about a visual scene delivered in a serial (textual) communication channel.
> A challenge for screen reader users, identified in the literature, of walking to a particular object in the virtual environment, or of following another participant, presumably lies under 4.2.1 ("Usage without vision"). The solution is to provide commands for performing these functions. I suspect similar issues could arise for those with "limited manipulation or strength" (4.2.7).
> Considerations for Further FAST Development
> Functional performance criteria, individually and in combination, could serve as a useful conceptual tool with which to identify and understand accessibility challenges emerging from actual or proposed new technologies. They could also serve as organizing principles for understanding users' access needs. I think consideration should be given to how to integrate them appropriately into the FAST. In addition, there are some XR-specific user requirements that could be included, or given as examples in the document.
> 
> Comments are most welcome.
> 
> Regards,
> 
> Jason.
> 

-- 

Janina Sajka

Linux Foundation Fellow
Executive Chair, Accessibility Workgroup:	http://a11y.org

The World Wide Web Consortium (W3C), Web Accessibility Initiative (WAI)
Chair, Accessible Platform Architectures	http://www.w3.org/wai/apa

Received on Wednesday, 27 March 2019 19:51:02 UTC