Re: 48-Hour Call for Consensus (CfC): Publish XAUR FPWD

+1 and wow to Mike for the review!

On Tue, Jan 28, 2020, 6:45 PM Matthew Tylee Atkinson <matkinson@paciellogroup.com> wrote:

> I support this going to FPWD.
>
> It was an enjoyable read and will be a helpful resource.
>
> Here are some questions, observations and typos from my review. (I realise
> there's quite a bit here - sorry I only got to this now - and I don't think
> that any of it should block publication as a FPWD.)
>
> I have not yet reviewed the related documents in section 5, but intend to
> do so.
>
> best regards,
>
>
> Matthew
>
>
> // Things that may need clarifying
>
> * [3] The domain/site says it's "WalkinVRDriver" (we have "Walking").
> Also, it sounds like this should be a link, to provide context? (In §3.5
> it's presented as a link.)
>
> * [3] I realise it may just be a character limit in the footnote IDs, but
> is it possible to call the "able-game" footnote ID "able-gamers" as that's
> their full name?
>
> * [3.2] "There will be many modality aspects for the developer and/or
> content author to consider:" - it looks like there's going to be a list of
> things there, but a "NOTE:". (The section ends by saying what's coming up
> in the input section, so I'm not sure if the colon relates to "the
> following sections".)
>
> * [3.5] The Xbox Adaptive Controller is mentioned. At the moment I'm not
> sure that it /is/ supported as an XR controller (ref our discussion about
> the WebVR Gamepad Module [0]) - though it certainly sounds like that would
> be a good idea for accessibility.
>
> * [3.6] I'm not clear on the first sentence; it seems to stand alone. (The
> point the section as a whole makes - that this is an important requirement
> - is well taken, of course.)
>
> * [3.7, 3.8] I think a great example that seems close to this, but not
> quite the same thing, is Xbox's co-pilot feature, which lets two people use
> two separate controllers to control one player character. This allows
> someone with disabilities to have a buddy who helps them with the parts
> they find less accessible or inaccessible. I wonder if it's worth
> mentioning somewhere around here as an example of good practice in the
> industry? [1]
>
> * [3.9] Some great points here. I wonder if this is an appropriate place
> for some examples.
>
>     - Do I understand correctly that the first point is about things that
> in the real world may be inherently accessible affordances, like tactile
> paving, or the presence of a handle or a plate on a door, but we are trying
> to figure out how to expose those in a natural way to the XR user?
>
>     - The last point is important and evokes two different situations for
> me - are these the ones you are describing and/or, if not, do you think
> either warrants mentioning?
>
>         a. Situations in which an underlying interaction is presented in a
> certain way, because it works for most people. But then, assistive
> technology is layered on top of that interaction in its rendered modality,
> but it would've been simpler for the user to have had a different
> representation. A classic example of this is drag-and-drop, which is
> frequently represented visually on web pages. It is possible to use ARIA
> etc. to convey things like draggable things and drop zones (making the
> problem 2D in some cases), but in many cases it may've been simpler to
> present the problem as specifying the order of items in a list, which can
> be done as a simple 1D problem with numbers or up/down keystrokes/buttons
> as inputs. This is a problem of adapting a particular rendering of the
> problem rather than the underlying problem.
>
>         b. Situations in which an adaptation has been carried out to
> transform one modality into another, to make it accessible, but we need a
> way to convey the user's interaction in the adapted modality back into the
> original one. E.g. reading a book in XR may be presented visually, with
> page-turning gestures required, but if a screen-reader is being used to
> read the text, keyboard interactions may be more suitable for the user.
> This is a problem of bridging between AT and content, and may imply the
> need for an accessibility layer for XR.
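>
> To make (a) concrete, here is a very rough sketch of the 1D alternative in
> a plain web context. (The function name and markup assumptions are mine,
> purely for illustration - nothing here comes from the XAUR.) It assumes a
> <ul> whose <li> children are focusable (tabindex="0"), and lets the user
> reorder items with the arrow keys instead of dragging:
>
>     // Hypothetical sketch: reorder list items with ArrowUp/ArrowDown
>     // instead of drag-and-drop.
>     function enableKeyboardReorder(list: HTMLElement): void {
>       list.addEventListener("keydown", (event: KeyboardEvent) => {
>         const item = event.target as HTMLElement;
>         if (item.tagName !== "LI") return;
>         const prev = item.previousElementSibling;
>         const next = item.nextElementSibling;
>         if (event.key === "ArrowUp" && prev) {
>           list.insertBefore(item, prev);      // move item one place earlier
>         } else if (event.key === "ArrowDown" && next) {
>           list.insertBefore(next, item);      // move item one place later
>         } else {
>           return;                             // ignore other keys
>         }
>         item.focus();            // keep focus on the item that was moved
>         event.preventDefault();  // don't also scroll the page
>       });
>     }
>
> The interesting question for XR, of course, is what the equivalent kind of
> re-presentation would look like in an immersive environment.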
>
> * [4.x, 4.6] The section on voice commands really made me think, as with
> other user needs sections, that we need existing AT to be able to work with
> XR, where it makes sense, in order to avoid wheel reinvention. Having said
> that, content design choices can make so much difference and ensure that
> people are kept immersed in the world, rather than being brought back out
> by the experience of using traditional AT. I think the focus on user needs
> - not how to meet them - is very helpful here, as those needs are not going
> to change.
>
> * [4.7] Can we add an extra requirement that says that the content is
> designed so that shape is used instead of just colour? I suggest that
> because, if the requirement is met and the user has only minor difficulties
> with contrast, it could be sufficient for them not to need to engage any
> specific display mode, making for a more immersive experience. There are
> several examples from the world of games that we could reference [2] (that
> list actually misses one that I think is particularly good, but I'd have to
> ask Ian about it, as I can't remember the name of the game I'm thinking of
> at the moment).
>
> * [4.8] Can we add an extra requirement to allow for scalable font sizes
> in both UI and content? This could negate the need for screen
> magnification. This (and many other adaptations, even at the same time) has
> been done successfully in research [3], and platform APIs like Dynamic Type
> on iOS and large text on Android are getting close.
>
> * [4.16] I'm not an expert on this, but I am wondering if mono audio
> should be provided as an alternative for people with hearing impairments
> (as opposed to, or in addition to, what is proposed)?
>
>
> // Housekeeping questions
>
> * Shouldn't the URL for the Latest Published Version use upper-case "XAUR"
> and not lower-case "xaur" to match the acronym case of the "TR" in the URL?
> Also, WCAG's URL includes upper-case "WCAG" and not lower-case "wcag".
>
> * The names of the APA XR accessibility docs are called just "XR" and
> don't refer to "WebXR" - is that to help disambiguate it from the WebXR
> WG's docs, or because APA is talking about general XR (web or native)?
>
> * [4.x] It would be really neat, for when people are talking about the
> XAUR, if the section numbers could be made to match the user need numbers.
>
>
> // Typos
>
> * [1.1, 4.1] 360° is typed using a superscript 0 - it should be the degree
> sign (&#176;).
>
> * [1.2, 1.3] "computer-mediated" (with the hyphen) may be slightly clearer?
>
> * [3.3] Would it be clearer if the note started its own paragraph? (This
> may well be overridden by W3C house style, with which I'm unfamiliar :-).)
>
> * [3.3] In the first three input device descriptions, the apostrophe that
> should be in "user's" is missing.
>
> * [3.3] The eye-tracking description refers to voice.
>
> * [3.5] I gather the official spelling is "Xbox" (not "XBOX").
>
> * [4.2] REQ 1 should be REQ 1b.
>
>
> // Bits I particularly liked
>
> It was all very helpful and a comprehensive introduction to XR, but...
>
> * [3] "Gamification of VR forces game dynamics on the user..." - I had a
> question/suggestion here but it was answered in the sixth of the last set
> of bullet points at the bottom of the section :-). Also the point about
> authoring tools is a vital one.
>
> * [3.2] I thought the way accessibility was cast in the context of
> multimodality was really neat.
>
> * [4.12, 4.19] Particularly great requirements :-).
>
>
> I'd be happy to file all of these as GitHub issues on
> <https://github.com/w3c/apa> if that helps (or if you would prefer I only
> file a subset, please let me know).
>
> best regards,
>
>
> Matthew
>
> [0] <https://lists.w3.org/Archives/Public/public-apa/2020Jan/0019.html>
> [1] <https://beta.support.xbox.com/help/account-profile/accessibility/copilot>
> [2] <http://gameaccessibilityguidelines.com/ensure-no-essential-information-is-conveyed-by-a-colour-alone/>
> [3] <http://www.eecs.harvard.edu/~kgajos/research/supple/>
> [4] <http://gameaccessibilityguidelines.com/avoid-repeated-inputs-button-mashingquick-time-events/>
>
> --
> Matthew Tylee Atkinson
> --
> Senior Accessibility Engineer
> The Paciello Group
> https://www.paciellogroup.com
> A Vispero Company
> https://www.vispero.com/
> --

Received on Tuesday, 28 January 2020 23:48:33 UTC