Re: Screen Readers, Braille Displays and Live Regions

Thanks Sean,
I’ll add these notes to our GitHub issue.

I would be very interested in the results of any testing you do, so please keep in touch.

In my research interview, the participant was using NVDA with a Braille display. The thing that is interesting to me from an implementation perspective is that some live region content was delivered in Braille and some was not.

Live region content triggered via native interactions (e.g. sliders, buttons, etc.) was delivered simultaneously in voice and in Braille.

One custom keyboard interaction (built with the application role) that allows for bimanual input, and that also triggers content to be delivered via live regions, delivered the content only in voice (not Braille).
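
Concretely, the two patterns look roughly like this (a simplified, hypothetical sketch – not our exact markup, but the same roles and attributes):

    <!-- Native interaction: live region alerts arrived in speech AND Braille -->
    <input type="range" aria-label="Left Hand" aria-valuetext="Left Hand at 3">
    <p aria-live="polite">Left Hand moved up.</p>

    <!-- Custom interaction: live region alerts arrived in speech only -->
    <div role="application" tabindex="0" aria-label="Both Hands"></div>
    <p aria-live="polite">Both hands moved up.</p>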

I think we will be submitting a bug report to NVDA, in case it is a bug on their end.

Taliesin

> On Nov 28, 2020, at 7:49 PM, Murphy, Sean <SeanMichael.Murphy@team.telstra.com> wrote:
> 
> On this topic, the general rule with the commonly used screen readers:
> VoiceOver on macOS or iOS with a Braille display shows the same information that is being spoken.
> JAWS has different Braille output levels (Structured, Line, and screen reader announced) – I have not tested JAWS with a Braille display and live regions. That is something I will do in the near future.
> TalkBack does not support Braille on its own; you have to install the BrailleBack app. How it handles live regions, I do not know.
> NVDA – I need to investigate.
> Narrator – the same.
>  
> Regards
> Sean Murphy
>  
> Sean Murphy | Senior Digital System specialist (Accessibility)
> Telstra Digital Channels | Digital Systems
> Mobile: 0405 129 739 | Desk: (02) 9866-7917
> Digital Systems Launch Page <https://confluence.in.telstra.com.au/display/DCSYS/Digital+Systems+-+Able+Home>
> Accessibility Single source of Truth <https://confluence.in.telstra.com.au/display/DCSYS/Accessibility+Resources>
>  
> From: Jonathan Avila <jon.avila@levelaccess.com>
> Sent: Tuesday, 17 November 2020 7:40 AM
> To: W3C WAI ig <w3c-wai-ig@w3.org>
> Subject: RE: Screen Readers, Braille Displays and Live Regions
>  
> In my experience, there are limitations in how aria-live content is announced by some screen readers. People who are deafblind rely on Braille output from screen readers. There are speech modes on the Braille display that allow users to display what was announced in speech, but these modes may be difficult to enable and to switch back and forth. I would reach out to the screen reader vendors to find out what can be done to change how they work with aria-live regions.
>  
> Jonathan
>  
> From: Taliesin Smith <talilief@gmail.com>
> Sent: Friday, November 13, 2020 9:54 AM
> To: Steve Green <steve.green@testpartners.co.uk>
> Cc: W3C WAI ig <w3c-wai-ig@w3.org>
> Subject: Re: Screen Readers, Braille Displays and Live Regions
>  
>  
> Thanks for the continued conversation and details on how screen readers and braille displays work.
>  
> I am asking these questions because I am part of a team that builds accessible, interactive science and math simulations used for learning.
>  
> In a recent interview with a blind person using our simulation called Ratio and Proportion, the interviewee told me that he was not receiving Braille output when moving Both Hands. I could hear that he was indeed receiving the aria-live output, though.
>  
> The “Both Hands” interaction is a custom interaction that is implemented using the application role and aria-live. It enables bimanual input, which allows the learner to move two objects (the Left Hand and the Right Hand) at the same time.
>  
> Learners can move the Left Hand up and down with the W and S keys, and they can move the Right Hand up and down with the up and down arrow keys, to explore the concepts of ratio and proportion. While moving the hands, they get description and sonification to support their exploration of the concepts.
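> 
> To make this concrete, here is a minimal sketch of how such a bimanual interaction can be wired up (simplified and hypothetical – the element ids, variable names, and alert text are illustrative, not our actual code):
> 
>   <div id="both-hands" role="application" tabindex="0"
>        aria-label="Both Hands, move with W, S, and the arrow keys"></div>
>   <p id="both-hands-alerts" aria-live="polite"></p>
> 
>   <script>
>     // Hypothetical hand positions; the real simulation tracks much more state.
>     let leftHand = 0;
>     let rightHand = 0;
>     const hands = document.getElementById('both-hands');
>     const alerts = document.getElementById('both-hands-alerts');
> 
>     hands.addEventListener('keydown', (event) => {
>       const handled = ['w', 's', 'ArrowUp', 'ArrowDown'];
>       if (!handled.includes(event.key)) return;
>       event.preventDefault(); // keep the arrow keys from scrolling the page
> 
>       // W/S move the Left Hand; the arrow keys move the Right Hand.
>       // Giving each hand its own keys is what enables bimanual input.
>       if (event.key === 'w') leftHand += 1;
>       if (event.key === 's') leftHand -= 1;
>       if (event.key === 'ArrowUp') rightHand += 1;
>       if (event.key === 'ArrowDown') rightHand -= 1;
> 
>       // Writing into the live region is what triggers the announcement.
>       alerts.textContent = 'Left Hand at ' + leftHand +
>                            ', Right Hand at ' + rightHand + '.';
>     });
>   </script>
> 
> It is alerts written to the live region from this kind of handler that were spoken but not shown in Braille.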
>  
> This learning tool also has two native sliders that allow learners to move the hands individually. The aria-valuetext and the aria-live alerts triggered through the use of the individual Left Hand and Right Hand sliders were output in Braille.
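> 
> For comparison, the individual hands are native range inputs, roughly along these lines (again a simplified, illustrative sketch rather than our exact markup):
> 
>   <input type="range" min="0" max="10" value="3"
>          aria-label="Left Hand"
>          aria-valuetext="Left Hand at 3">
>   <p aria-live="polite">Left Hand moved up.</p>
> 
> With this native pattern, both the aria-valuetext and the live region alert reached the Braille display.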
>  
> I am thinking there must be something different about the aria-live content triggered by a change via a slider interaction versus the aria-live content that is triggered via a custom interaction created with the application role.
>  
> Here is a link to Ratio and Proportion <https://phet-dev.colorado.edu/html/ratio-and-proportion/1.0.0-dev.73/phet/ratio-and-proportion_en_phet.html> if you want to try it out. It is a prototype, and we are still actively working on the design of the interactive description, which is accessed via screen reader software (and braille displays). To experience the issue, you need to use the Both Hands interaction with a screen reader and a braille display.
>  
> Since I don’t have a braille display, I first wanted to know whether there is something fundamentally different about accessible text delivered via a screen reader versus via a Braille display.
>  
> It seems there can be differences, so my next question is: in our case, is there something different about the native sliders and the custom Both Hands interaction that is preventing the Braille from displaying? Both the sliders and the Both Hands interaction deliver text via ARIA live regions. In the case of the sliders, the content is displayed in Braille; in the case of the Both Hands interaction, it is not.
>  
> Of course, it might be an issue with the screen reader if the screen reader determines what should be displayed in Braille. The screen reader was NVDA.
>  
> If nothing else, the issue is a lot of food for thought. 
>  
> Taliesin
>  
> On Nov 10, 2020, at 11:46 AM, Steve Green <steve.green@testpartners.co.uk> wrote:
>  
> As far as I can tell, that page is intended for people who design screen reader software, not for web developers. Are you asking these questions as a web developer or an AT designer?
>  
> Steve Green
> Managing Director
> Test Partners Ltd
>  
>  
> From: Taliesin Smith <talilief@gmail.com>
> Sent: 10 November 2020 15:01
> To: W3C WAI ig <w3c-wai-ig@w3.org>
> Subject: Screen Readers, Braille Displays and Live Regions
>  
> Hi Folks,
> I have a few very general questions about screen reader software (SR), braille displays (BD), and ARIA live regions.
>  
> 1. Is it safe to assume that text is text, and that the screen reader software (SR) and braille display devices (BD) handle or display text-based content in the same way, i.e. through the same channels simultaneously?
>  
> 2. In theory, is text marked up in an element properly designated as an ARIA live region in some way different from text marked up in a paragraph tag? I mean, is it really different once the text is recognized by the SR or the BD?
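> 
> For example (illustrative markup), the text content is identical in the two paragraphs below; only the attributes on the container differ:
> 
>   <p>The hands are at a ratio of 1 to 2.</p>
> 
>   <p aria-live="polite">The hands are at a ratio of 1 to 2.</p>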
>  
> 3. Since users often use both SR and BD together, are there special situations that web developers should be aware of to ensure text is accessible and deliverable by both technologies simultaneously? Especially in the case of custom interactions where ARIA live regions are being employed.
>  
> Thanks for any thoughts or resources people may know about.
>  
> I was just reading this resource from MDN, which mentions “channels”:
> https://developer.mozilla.org/en-US/docs/Web/Accessibility/ARIA/ARIA_Screen_Reader_Implementors_Guide
>  
> Taliesin
>  
> Taliesin Smith
> talilief@gmail.com
>  
> ~.~.~
> Also reachable at:
> Taliesin.Smith@colorado.edu
> Inclusive Design Researcher
> PhET Interactive Simulations
> https://phet.colorado.edu/en/accessibility
> Physics Department
> University of Colorado, Boulder

Received on Monday, 30 November 2020 15:11:45 UTC