- From: Taliesin Smith <talilief@gmail.com>
- Date: Wed, 28 Jul 2021 13:05:59 -0230
- To: Iulia Brehuescu <iulia.brehuescu@mediamonks.com>
- Cc: w3c-wai-ig@w3.org
- Message-Id: <B621BBA1-A1D4-45A0-A223-6DA807BCAE7E@gmail.com>
Hi Iulia,

As Jonathan mentioned, we have examples of accessible interactive science and math simulations, and even though the URL says "prototypes", we have 10 simulations with fully implemented interactive description designs that are not technically prototypes.

You can filter our main website by Accessibility Feature:
https://phet.colorado.edu/en/simulations/filter?a11yFeatures=accessibility&sort=alpha&view=grid

Our simulations that can be made accessible are built in HTML5 and use any of three visual rendering technologies (canvas, SVG, and WebGL), individually or together. We use a custom scene graph called scenery. We created an accessibility API for scenery that creates a semantically rich HTML layer that we refer to as a "Parallel DOM" or "PDOM". The PDOM houses dynamic current-state information, and it also leverages aria-live regions, aria-valuetext, and many other ARIA attributes to deliver descriptions of relevant changes as they happen. (A rough sketch of this pattern appears after the quoted message below.)

We have written several academic papers about this work. The most relevant papers might be:

CHI 2020 - Storytelling to Sensemaking: A Systematic Framework for Designing Auditory Description Display for Interactives
https://dl-acm-org.colorado.idm.oclc.org/doi/10.1145/3313831.3376460

W4A 2018 - Parallel DOM Architecture for Accessible Interactive Simulations
https://dl-acm-org.colorado.idm.oclc.org/doi/10.1145/3192714.3192817

For the past year or so we have been working on a "Voicing" feature that learners can customize, choosing what they want voiced as they interact. For this Voicing feature, learners do not need screen reader software. (A second sketch below illustrates one possible approach.) We research and use sound and sonification as well.

Taliesin Smith
talilief@gmail.com
~.~.~
Also reachable at: Taliesin.Smith@colorado.edu
Inclusive Design Researcher
PhET Interactive Simulations
https://phet.colorado.edu/en/accessibility
Physics Department
University of Colorado, Boulder

> On Jul 26, 2021, at 6:30 AM, Iulia Brehuescu <iulia.brehuescu@mediamonks.com> wrote:
>
> Hello,
>
> I am looking for some thoughts on how to address accessibility on WebGL experiences. For reference, what I mean by a WebGL experience is something similar to what Netflix did for The Witcher <https://witchernetflix.com/en-gb>.
>
> What would be the best/most impactful approach:
> Provide a separate, more linear/simpler experience while maintaining the main user flows?
> Or provide "Accessibility Settings" where users can change font style or enable/disable animations, etc.? (This idea I took from games & accessibility.)
> Any feedback would be appreciated!
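As a rough illustration of the Parallel DOM idea described above: a visually rendered scene (canvas, SVG, or WebGL) is mirrored by a hidden but semantically rich HTML layer that assistive technology can read, with aria-valuetext and an aria-live region carrying descriptions of changes. This is only a minimal sketch of the general pattern, not scenery's actual API; all element names and description strings here are made up for the example.

```typescript
// Minimal sketch of a "parallel DOM" for a canvas/WebGL scene.
// Names and descriptions are illustrative only, not scenery's real API.

// The visual scene lives in a canvas that assistive technology cannot inspect.
const canvas = document.createElement('canvas');
document.body.appendChild(canvas);

// The parallel DOM: plain HTML siblings that carry the semantics,
// styled elsewhere to be visually hidden but still exposed to AT.
const pdom = document.createElement('div');
pdom.className = 'pdom';

// Current-state description of an object in the scene.
const stateDescription = document.createElement('p');
stateDescription.textContent = 'The balloon is near the sweater, with no net charge.';
pdom.appendChild(stateDescription);

// An interactive control exposed as a native slider; aria-valuetext will
// provide a qualitative description instead of a bare number.
const slider = document.createElement('input');
slider.type = 'range';
slider.min = '0';
slider.max = '100';
slider.setAttribute('aria-label', 'Balloon position');
pdom.appendChild(slider);

// An aria-live region used to announce relevant changes as they happen.
const liveRegion = document.createElement('div');
liveRegion.setAttribute('aria-live', 'polite');
pdom.appendChild(liveRegion);

document.body.appendChild(pdom);

// When the model changes, update both the visual scene and the parallel DOM.
slider.addEventListener('input', () => {
  const value = Number(slider.value);
  const position = value > 50 ? 'close to the wall' : 'near the sweater';
  // ...redraw the canvas/SVG/WebGL scene here...
  slider.setAttribute('aria-valuetext', position);              // qualitative value for AT
  stateDescription.textContent = `The balloon is ${position}.`; // keep current state fresh
  liveRegion.textContent = `Balloon moved, now ${position}.`;   // announce the change
});
```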
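And as a second sketch, one way a self-voicing feature could work without a screen reader is to speak responses through the standard Web Speech API, gated by learner preferences for what gets voiced. This is an assumption about the general approach; PhET's actual Voicing implementation and preference categories may differ, and the names below are hypothetical.

```typescript
// Hypothetical sketch of a customizable self-voicing feature using the
// standard Web Speech API (no screen reader required).

// User-configurable preferences for which response categories are voiced.
interface VoicingPreferences {
  objectResponses: boolean;   // names/states of objects
  contextResponses: boolean;  // consequences of an interaction
  hints: boolean;             // guidance on what to try next
}

const preferences: VoicingPreferences = {
  objectResponses: true,
  contextResponses: true,
  hints: false,
};

type ResponseKind = keyof VoicingPreferences;

// Speak a response only if the learner has enabled that category
// and the browser supports speech synthesis.
function voice(kind: ResponseKind, text: string): void {
  if (!preferences[kind] || !('speechSynthesis' in window)) {
    return;
  }
  const utterance = new SpeechSynthesisUtterance(text);
  window.speechSynthesis.cancel(); // interrupt stale speech before new output
  window.speechSynthesis.speak(utterance);
}

// Example usage from an interaction handler:
voice('objectResponses', 'Balloon grabbed.');
voice('contextResponses', 'Balloon has a negative net charge.');
voice('hints', 'Try releasing the balloon near the wall.'); // skipped: hints disabled
```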
Received on Wednesday, 28 July 2021 15:36:18 UTC