Re: Zoom documentation on sign language interpretation feature

As an update on this discussion, I have learned that the new sign language interpretation feature in Zoom has been available since October of last year, but it must be enabled at the enterprise or account level. I verified that I have access to these account settings in my Pearson Zoom account, where the feature has been enabled at the Enterprise (i.e., company-wide) level; however, I would still have to turn it on manually in my own settings before it can be used in any Zoom meetings I schedule with my Pearson account.
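For anyone who would rather script that last step than click through the web portal, the same per-user toggle can in principle be flipped through Zoom's REST API (the v2 "Update user settings" endpoint definitely exists). The sketch below is a minimal Python example under stated assumptions: the field name "sign_language_interpretation" under the "in_meeting" settings group, and its exact shape, are my guesses from the API reference and should be verified; the OAuth token handling is also simplified.

```python
# Minimal sketch: turning on the sign language interpretation setting for a
# Zoom user via PATCH /v2/users/{userId}/settings ("Update user settings").
# ASSUMPTION: the toggle lives under the "in_meeting" group as
# "sign_language_interpretation"; the exact field name and shape (boolean vs.
# object) should be checked against the current Zoom API reference. The
# account/enterprise-level setting described above must already allow it.
import requests

ZOOM_API_BASE = "https://api.zoom.us/v2"
ACCESS_TOKEN = "YOUR_OAUTH_ACCESS_TOKEN"  # e.g., from a Server-to-Server OAuth app


def enable_sign_language_interpretation(user_id: str = "me") -> None:
    """PATCH the user's meeting settings to enable the feature."""
    resp = requests.patch(
        f"{ZOOM_API_BASE}/users/{user_id}/settings",
        headers={
            "Authorization": f"Bearer {ACCESS_TOKEN}",
            "Content-Type": "application/json",
        },
        json={
            # Assumed payload shape -- verify before relying on this.
            "in_meeting": {"sign_language_interpretation": {"enable": True}}
        },
        timeout=30,
    )
    resp.raise_for_status()  # Zoom returns 204 No Content on success


if __name__ == "__main__":
    enable_sign_language_interpretation()
```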

I found this post on the DO-IT website useful:
Zoom Announces New Features in Support of Sign Language | DO-IT:
https://www.washington.edu/doit/zoom-announces-new-features-support-sign-language-0

I would draw attention to the following paragraph at the end of that page (copied below), especially the third problem they list:
"AccessComputing staff has been actively testing this new feature and providing ongoing feedback to Zoom’s accessibility team. There are a few known limitations. For example, as of November 2022, the interpreter window is not captured when the meeting is recorded. This limitation is expected to be addressed for cloud recordings in an upcoming release. A second problem is that the interpreter channel only works in the main room, not in breakout rooms. Interpreters can be assigned to breakout rooms, but in the breakout room they become regular participants again, so participants who need their services will need to fall back to the old way of doing things (e.g., pinning and spotlighting), until they return to the main room. A third problem is that interpreters by default do not have the ability to unmute themselves so they can speak on behalf of a participant. The host must change a setting for each interpreter to allow them to speak, which is burdensome for the host and adds an extra layer where things can go wrong."

Hope this helps.

--Steve



Steve Noble
Principal Researcher, Accessibility
Psychometrics & Testing Services

Pearson

502 969 3088
steve.noble@pearson.com


[NSF's Convergence Accelerator - 2022 Cohort Member]<https://beta.nsf.gov/funding/initiatives/convergence-accelerator>


________________________________
From: Steve Noble <steve.noble@pearson.com>
Sent: Wednesday, May 17, 2023 10:12 AM
To: RQTF <public-rqtf@w3.org>
Subject: Zoom documentation on sign language interpretation feature

I found this page on the Zoom website which details how this works:
https://support.zoom.us/hc/en-us/articles/9644962487309-Using-sign-language-interpretation-in-a-meeting-or-webinar

This should be helpful going forward.

--Steve




Steve Noble
Principal Researcher, Accessibility
Psychometrics & Testing Services

Pearson

502 969 3088
steve.noble@pearson.com


[NSF's Convergence Accelerator - 2022 Cohort Member]<https://beta.nsf.gov/funding/initiatives/convergence-accelerator>

Received on Thursday, 18 May 2023 12:42:01 UTC