- From: Steve Green <steve.green@testpartners.co.uk>
- Date: Thu, 23 Jul 2020 23:05:19 +0000
- To: WAI Interest Group discussion list <w3c-wai-ig@w3.org>
You are completely misunderstanding what I wrote. The point is that there was a consensus that "synchronised media" means that a single action causes both the audio and video tracks to start, rather than requiring two separate actions. There may not necessarily be any sound or video content at the start - how they relate to each other is an editorial decision.

With regard to audio description, you obviously wouldn't superimpose it over dialogue. However, sometimes there are not sufficient gaps in the dialogue or other audio content, which is why we have the Extended Audio Description level AAA success criterion - https://www.w3.org/TR/WCAG21/#extended-audio-description-prerecorded.

I don't know why you think that anyone would remove the synchronisation between audio and video. When I referred to "unsynchronised video", I specifically said that this refers to cases where the relative timing of the audio and video tracks is not important. An example would be where background music is just "fluff" that provides no theatrical effect (such as crescendos, mood changes, sudden pauses etc.).

Where we do agree is that there is no good official definition of synchronised media. Perhaps that can be addressed in a future WCAG update.

Steve

-----Original Message-----
From: Janina Sajka <janina@rednote.net>
Sent: 23 July 2020 23:30
To: Steve Green <steve.green@testpartners.co.uk>
Cc: WAI Interest Group discussion list <w3c-wai-ig@w3.org>
Subject: Re: When is something a web app and when is it synchronized media

Simply starting at the same time is a poor definition of synchronised media. Limit yourself that way and your video descriptions rendered in audio won't just fit in between dialogue, they'll step all over the dialogue. Furthermore, it's a strong disservice to people with significantly limited hearing who count on cues from lip movement to help them understand audio they can only barely hear.
Unsynchronize that, and a life strategy that works in person fails in web media -- not good.

Best,

Janina

Steve Green writes:
> I feel your pain. We have a client who builds e-learning modules using Storyline 360, and the code is totally disgusting. Sadly, you can't do anything about most of it. You can't fix any of the numerous WCAG non-conformances in the player, and you are extremely limited as to what you can fix in the course content. If you want to create accessible e-learning modules, Storyline is an absolutely terrible place to start. In my experience, all these "rapid development" e-learning authoring tools are terrible, but Storyline is the worst.
>
> I recently started a discussion on the WebAIM website, seeking views on what the phrase "synchronised media" actually means, because it is not adequately defined in the WCAG. To my surprise, absolutely no one agreed with my interpretation of the phrase! For consistency, I now work to everyone else's definition even though I don't like it.
>
> My interpretation of WCAG's definition of "synchronised media" was that the audio and video need to be synchronised, such as when you have a talking head, i.e. the relative timing of the audio and video is important. I didn't believe that merely having audio and video present at the same time is sufficient to constitute "synchronised media" if their relative timing does not matter. It would simply be "multimedia". The word "synchronised" has a very specific meaning, and I didn't think we could simply ignore it.
>
> However, everyone else was of the view that audio and video media are synchronised if a single user action causes them both to either start or stop at the same time, regardless of whether their relative timing is important.
>
> This implies that there is no such thing as "unsynchronised media" - there is only audio-only, video-only and synchronised media.
> But that raises the question of why WCAG uses the word "synchronised" (which is not in common usage) rather than "multimedia", which everyone understands (and which does not imply that the relative timing of the audio and video tracks is important).
>
> To return to your question, it sounds like you have a mixture of static content and synchronised media. Note that screen readers have nothing to do with whether content is regarded as synchronised media or not. The audio generated by the screen reader is not part of the content for the purposes of a WCAG audit.
>
> Steve Green
> Managing Director
> Test Partners Ltd
>
>
> From: Matthew Kreiling <kreiling@gmail.com>
> Sent: 23 July 2020 18:38
> To: w3c WAI List <w3c-wai-ig@w3.org>
> Subject: When is something a web app and when is it synchronized media
>
> I have been working to assess and find a process for creating accessible eLearning modules using Articulate Storyline 360.
>
> The content of the "slides" within the player is coded as a bunch of SVG images, including the text. They are hidden using aria-hidden, but screen readers are provided with hidden text that receives focus programmatically when the user uses the arrow keys to navigate.
>
> Visually, focus is indicated appropriately and synchronized with the screen reader.
>
> I hate it as a developer, but as an assessor of WCAG compliance, I am unsure how to treat the content within the player. Sometimes it is static, sometimes it is animations accompanied by audio with optional closed captions, and sometimes there are buttons to interact with it.
>
> Do I treat the content within the player as synchronized media?
>
> Matthew Kreiling
>
> We're all in it together.
>
> "To live is so startling it leaves little time for anything else."
> —Emily Dickinson

--
Janina Sajka
https://linkedin.com/in/jsajka

Linux Foundation Fellow
Executive Chair, Accessibility Workgroup: http://a11y.org

The World Wide Web Consortium (W3C), Web Accessibility Initiative (WAI)
Co-Chair, Accessible Platform Architectures
http://www.w3.org/wai/apa
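The pattern Matthew describes in his original message (SVG slide content removed from the accessibility tree, plus a visually hidden text layer that receives programmatic focus on arrow-key navigation) might be sketched roughly as follows. All markup, class names and the script here are illustrative assumptions, not Storyline's actual output:

```html
<!-- Visual layer: rendered SVG artwork (including drawn text),
     hidden from assistive technology with aria-hidden. -->
<div aria-hidden="true">
  <svg viewBox="0 0 800 600"><!-- slide graphics --></svg>
</div>

<!-- Screen-reader layer: visually hidden plain text, made focusable
     from script via tabindex="-1". (Hypothetical id and class.) -->
<div id="slide-text" class="visually-hidden" tabindex="-1">
  Heading and body text of the current slide, as plain text.
</div>

<script>
  // Sketch of the behaviour described: on arrow-key navigation, move
  // programmatic focus to the hidden text so the screen reader announces it.
  document.addEventListener('keydown', (e) => {
    if (e.key === 'ArrowRight' || e.key === 'ArrowLeft') {
      document.getElementById('slide-text').focus();
    }
  });
</script>
```

The `visually-hidden` class would use the usual offscreen CSS (e.g. `position: absolute; width: 1px; height: 1px; overflow: hidden; clip-path: inset(50%);`) so the text stays available to screen readers without being rendered visibly.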
Received on Thursday, 23 July 2020 23:05:35 UTC