- From: Joshue O Connor <joconnor@w3.org>
- Date: Tue, 10 Sep 2019 09:40:13 +0100
- To: John Foliot <john.foliot@deque.com>
- Cc: Alice Boxhall <aboxhall@google.com>, Léonie Watson <lw@tetralogical.com>, Janina Sajka <janina@rednote.net>, James Craig <jcraig@apple.com>, W3C WAI Accessible Platform Architectures <public-apa@w3.org>, RQTF <public-rqtf@w3.org>, public-personalization-tf <public-personalization-tf@w3.org>
On 09/09/2019 15:44, John Foliot wrote:
> Josh writes:
>
>> the annotations a la ARIA or something new - that does provide the
>> vocabulary for the areas we have identified like Navigation, Object and
>> Interaction semantics.
>
> I'll only add that there may also be an additional need/wrinkle there Josh,
> and that's in the Personalization vein, where the Personalization TF is
> attempting to crack a tack-on issue of making user-interfaces more
> customizable.
>
> So, for example, in XR environments, will the current taxonomy terms found
> for data-action suffice?
> (https://w3c.github.io/personalization-semantics/content/index.html#action-explanation)
> Will they even be applicable? (If no, are there any suggestions on tackling
> the user-need there?)

That's useful to know, John, thanks for passing that on. My initial
reaction is that this kind of thing should be supportable by relevant
APIs and, I dare say, would be relatively easier, as it seems related to
current 'knowns'.

> Will the architecture support something like data-distraction, where the
> goal would be to remove aspects of the content that are non-essential and
> may have a negative impact on some users?
> (https://w3c.github.io/personalization-semantics/content/index.html#distraction-explanation)

Again, useful to know - and this kind of thing could be useful in terms
of 'modality muting', where unnecessary renderings of an environment
(such as the visual drawing) can be cut out of the data stream.

Thanks

Josh

> At any rate, more questions without concrete answers.
>
> I hope you all have a great TPAC, and I am sorry to be missing it this year.
>
> JF
>
> On Mon, Sep 9, 2019 at 7:37 AM Joshue O Connor <joconnor@w3.org> wrote:
>
>> Hello Alice and all,
>>
>> On 09/09/2019 00:50, Alice Boxhall wrote:
>>
>>> [...]
>>> Everything Léonie said!
>>
>> Great, thanks for the feedback both. It's very useful.
>>
>>> My main concern would be that the existing ARIA vocabulary/existing AT
>>> interaction patterns would be too limiting for UX designed for an
>>> immersive environment (orthogonal to the AOM API design), which Janina
>>> touches on below.
>>
>> Right, and mine - so it would be great if we could get to an
>> understanding of what a baseline semantic architecture for XR would look
>> like, and then we can work out what the annotations - a la ARIA or
>> something new - are that provide the vocabulary for the areas we have
>> identified, like Navigation, Object and Interaction semantics.
>>
>>>>> 2.) Do we need bi-directionality for good XR support? Semantics can be
>>>>> consumed by user agents but may be modified in an immersive environment
>>>>> and change as interactions are happening. Like React is data driven, XR
>>>>> semantics may be interaction or results driven.
>>>>>
>>>>> 3.) What would be the ideal architecture to support XR accessibility?
>>>>> We seem to be currently aiming at patching XR with current and even
>>>>> legacy AT, so that architecture may be temporary, or move away from
>>>>> browser and API interactions towards AT being embedded in an immersive
>>>>> environment. What does "good" look like in this situation?
>>>
>>> These are really interesting and important questions - I don't know
>>> enough about XR to start answering them.
>>
>> This is really why meeting and discussing is going to be (a lot of fun)
>> and very helpful.
>>
>>>>> 4.) Are Object Oriented approaches to accessible XR preferable to
>>>>> declarative or author-applied semantics?
>>>>>
>>>>> Please confirm whether 11:00 Thursday works.
>>> It's open for me, although if the topic is primarily going to be XR I'm
>>> unclear why this would be a separate session from the proposed plenary
>>> session <https://www.w3.org/wiki/TPAC/2019/SessionIdeas#XR_Accessibility>
>>> on Wednesday.
>>
>> Well, that is a rather general session/conversation around how to provide
>> better engagement on XR with the current a11y community, as well as to let
>> people know of the work we are all doing in this space - for example,
>> Léonie and Dom's upcoming workshop, and our current XAUR drafts - as well
>> as how people who may not be monitoring this work so closely can engage
>> with it. So I think this will be more general, whereas the AOM topic is
>> quite specific and your expertise is needed.
>>
>> Thanks, and I look forward to working with you.
>>
>> Josh
>>
>> --
>> Emerging Web Technology Specialist/Accessibility (WAI/W3C)

--
Emerging Web Technology Specialist/Accessibility (WAI/W3C)
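
For illustration of the 'modality muting' idea discussed above, a minimal
TypeScript sketch of how a user agent script might act on the draft
data-distraction annotation from the Personalization Semantics Content Module
linked in the thread. The attribute name comes from that draft; the function
name, selector, and hiding strategy are assumptions made here for
illustration only, not part of any specification.

    // Minimal sketch, assuming the draft Personalization Semantics
    // "data-distraction" attribute is applied to DOM elements that back a
    // rendered (or XR) scene. muteNonEssential() and the hiding strategy
    // are hypothetical, illustrative choices only.
    function muteNonEssential(root: ParentNode = document): void {
      // Find anything an author has flagged as a distraction,
      // e.g. <div data-distraction="sensory">...</div>.
      const distractions = root.querySelectorAll<HTMLElement>("[data-distraction]");
      distractions.forEach((el) => {
        el.hidden = true;                        // drop from the visual rendering
        el.setAttribute("aria-hidden", "true");  // drop from the accessibility tree
      });
    }

    // Example: a stored user preference could trigger the muting pass on load.
    // muteNonEssential();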
Received on Tuesday, 10 September 2019 08:40:22 UTC