Common technologies used in XR applications

In preparation for the meeting tomorrow, I searched for references describing the technologies commonly used in XR applications. Understanding the scope of the available technologies should assist us in identifying the accessibility challenges unique to XR.

I didn’t find a definitive reference, but it was possible to combine information from various sources.

In particular, the introductory material in the following book proved useful:

Erin Pangilinan, Steve Lukas, and Vasanth Mohan, Creating Augmented and Virtual Realities: Theory and Practice for Next-Generation Spatial Computing, O'Reilly Media, 2019 (available on Bookshare for individuals with print disabilities).

On the input side, game-like controllers appear to be common. However, I also noticed discussion of hand movement and gesture recognition, as well as speech recognition.
Tracking of the user's movements is also central, so that the displayed imagery (and, presumably, spatial audio) can be updated as the user's attention shifts. Head tracking is apparently widely supported, and eye tracking is also available on some devices.
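To make the role of head tracking concrete, here is a minimal sketch (not tied to any particular XR API) of the core idea: the renderer applies the inverse of the tracked head rotation to the scene, so that virtual objects appear to stay fixed in the world as the head turns. The function names and the single-axis (yaw-only) simplification are my own for illustration.

```python
import math

def yaw_rotation(yaw_rad):
    # Rotation about the vertical (y) axis, as a head tracker might report it.
    c, s = math.cos(yaw_rad), math.sin(yaw_rad)
    return [[c, 0.0, s],
            [0.0, 1.0, 0.0],
            [-s, 0.0, c]]

def world_to_view(point, head_yaw_rad):
    # To keep a virtual object stationary in the world as the head turns,
    # apply the inverse of the head rotation to the object's position.
    m = yaw_rotation(-head_yaw_rad)
    x, y, z = point
    return tuple(m[i][0] * x + m[i][1] * y + m[i][2] * z for i in range(3))

# An object one metre ahead of the user (negative z, a common convention):
obj = (0.0, 0.0, -1.0)
facing_forward = world_to_view(obj, 0.0)          # object stays dead ahead
turned_left = world_to_view(obj, math.pi / 2)     # object swings to the right of view
```

Real systems track full six-degree-of-freedom poses (rotation and translation) at high rates, but the update loop follows this same pattern.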

On the output side, there are of course 3D graphics and spatial audio, but also haptic technologies (e.g., vibratory devices).
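As a toy illustration of the spatial-audio side (deliberately far simpler than the HRTF-based rendering real systems use, and not any particular library's API), equal-power panning maps a source's azimuth to left/right channel gains:

```python
import math

def equal_power_pan(azimuth_rad):
    # Map a source azimuth (-pi/2 = hard left, +pi/2 = hard right) to
    # left/right gains using an equal-power law, so perceived loudness
    # stays constant as the source moves across the stereo field.
    t = (azimuth_rad + math.pi / 2) / math.pi  # normalise to [0, 1]
    left = math.cos(t * math.pi / 2)
    right = math.sin(t * math.pi / 2)
    return left, right

centre = equal_power_pan(0.0)            # equal gains in both channels
hard_left = equal_power_pan(-math.pi / 2)  # all signal in the left channel
```

Combined with the head tracking described above, the azimuth would be recomputed each frame relative to the listener's current orientation.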

One of the fundamental design concepts is that the hardware with which the user interacts recedes into the background of awareness. Attention is instead focused on the virtual objects, which give the impression of residing in the surrounding physical environment: substituting for it in the case of virtual reality, or superimposed on it in augmented reality applications.

Participants in this discussion with greater XR expertise than I have are welcome to add to the foregoing description, so that we can bring together a clear characterization of the available technologies used in XR contexts. It would also be valuable, or so it seems to me, to clarify which technologies are currently widely and publicly available, which are still predominantly used for research purposes, and when we expect some of the latter to achieve broader adoption. This information would help to shape the assumptions underlying whatever guidance we offer to potential implementers.

It seems to me that none of the technologies noted here is unique to XR applications, except perhaps movement tracking, but it’s the combination of such input and output methods (and the user interface designs that accompany them) which creates the uniqueness of XR experiences.




Received on Tuesday, 20 August 2019 13:51:07 UTC