Re[2]: Clarification on the notion of "Second Screen"?

------ Original message ------
From "Janina Sajka (janina@rednote.net)" <janina@rednote.net>
To: "Francois Daoust" <fd@w3.org>
Cc : "Michael Cooper" <cooper@w3.org>; "W3C WAI Accessible Platform 
Architectures" <public-apa@w3.org>
Date : 08/12/2021 10:48:28

>Dear Francois:
>
>I have CC'd the public APA email list, so that we might have a basis for
>discussion during our teleconference later today. I hope that's OK!

Sure!

>
>
>I have two comments inline of your last email below which are relevant.
>One will need further discussion, imo.
>
>Francois Daoust writes:
>>  Hi Janina,
>>
>>  I confirm that Bluetooth headsets and speakers are included in the
>>  definition of "screen" (the charter rather uses the term "presentation
>>  display"). In particular, the draft charter explicitly mentions Bluetooth
>>  and includes the following sentence: "For the purposes of this charter,
>>  presentation displays include wireless speakers as well".
>>
>
>Excellent, and thanks for the clarification. One down, one to go!
>
>>  That said, I am not sure whether the group considers refreshable braille
>>  displays to be a possible "presentation display" as well. My limited
>>  understanding of refreshable braille displays is that they are meant to
>>  display characters, whereas the presentation displays being considered by
>>  the Second Screen Working Group are either those capable of rendering audio
>>  or video streams, or those capable of running HTML applications.
>>
>
>So, this may be an item for us to clarify with your group. Going back
>to the days of the development of the HTML 5 specification, our
>accessibility requirements clearly considered alternative renderings of
>video/audio content important to convey to second screen devices,
>broadly understood. We documented these requirements in a document
>prepared to support the inclusion of markup in the HTML 5 standard, and
>our additional markup was accepted into HTML 5 with very little
>controversy. The only item that was not standardized in HTML 5 was our
>requirement for a programmatic association of a video transcript with
>the primary video resource.
>
>The document we prepared for HTML 5 standardization was published as a
>Working Group Note by APA's predecessor Working Group, Protocols and
>Formats (PFWG) under the title: "Media Accessibility User Requirements
>(MAUR)" available here:
>
>http://www.w3.org/TR/media-accessibility-reqs/
>
>The two most relevant sections in this document to the current
>conversation would be:
>
>Sec. 3.8: Requirements on secondary screens and other devices
>
>https://www.w3.org/TR/media-accessibility-reqs/#requirements-on-secondary-screens-and-other-devices
>
>Sec. 3.5: Discovery and activation/deactivation of available alternative
>content by the user
>
>https://www.w3.org/TR/media-accessibility-reqs/#discovery-and-activation-deactivation-of-available-alternative-content-by-the-user
>
>>  Said differently, the APIs developed by the group may be used to stream
>>  audio and/or video content to a presentation display, or to establish a
>>  communication channel between a web application running on a controller
>>  device and a web application running on the presentation display. However,
>>  there are no provisions to stream pure text content to a presentation
>>  display.
>>
>>  Or do I misunderstand what refreshable braille displays encompass?
>>
>
>No, your understanding is correct. Regrettably, I think we need to have a
>scoping conversation for the proposed API. I expect APA would strongly
>oppose excluding alternative content from a W3C technology API.
>
>Let me also note that alternative content is already being provided in
>commercial settings. I hope to experience the following service myself
>soon on a forthcoming trip to New York City:
>
>https://www.galapro.com/
>
>I sincerely hope we can find a way to accommodate accessibility support
>in your proposed API. Not only Broadway theaters but also educational
>settings, among many other venues, would benefit. And, as the above
>service demonstrates, the benefits extend to I18N applications, not
>just accessibility support.

I would like to reformulate one of my points. There are provisions to 
stream text tracks along with the audio and video streams in the Remote 
Playback API (and in the underlying Open Screen Protocol), so 
alternative content is supported. What is not envisioned in the Remote 
Playback API is a mode where media playback continues locally while the 
second screen plays back the alternative content.
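
To illustrate, here is a minimal sketch of what that looks like from a 
web application today (element IDs and file names are made up, and this 
assumes a browser that implements the Remote Playback API):

    // Assumed markup: a video element with a captions track attached
    // as alternative content.
    // <video id="movie" src="movie.mp4">
    //   <track kind="captions" src="captions.vtt" srclang="en" default>
    // </video>

    const video = document.getElementById('movie') as HTMLVideoElement;
    const remoteButton =
      document.getElementById('remote') as HTMLButtonElement;

    // Enable the button only when a compatible presentation display
    // is available.
    video.remote.watchAvailability((available) => {
      remoteButton.disabled = !available;
    }).catch(console.error);

    // prompt() must run in response to a user gesture. The text track
    // travels along with the audio/video streams during remote playback.
    remoteButton.addEventListener('click', () => {
      video.remote.prompt().catch(console.error);
    });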

In other words, the scenarios that the Second Screen Working Group is 
trying to enable with the Remote Playback API are "local OR remote 
playback", not "local AND remote playback in combination". In 
particular, the spec contains the following statement: "The user agent 
SHOULD pause local audio and video output of the media element while 
the remote playback state is connected."
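
From script, a page can observe those transitions through the events 
the Remote Playback API exposes (a minimal sketch; the pausing itself 
is up to the user agent):

    const video = document.querySelector('video')!;

    video.remote.addEventListener('connect', () => {
      // Remote playback took over; per the statement quoted above, the
      // user agent SHOULD pause local audio and video output here.
      console.log(video.remote.state); // "connected"
    });

    video.remote.addEventListener('disconnect', () => {
      // Playback is local-only again.
      console.log(video.remote.state); // "disconnected"
    });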

If refreshable braille displays can present themselves as able to play 
media and/or alternative content, then they can be considered second 
screens for the purpose of the Remote Playback API. However, if the 
device is text-only, the user will only experience the text track 
content; the local device will not continue to render the audio/video 
streams at the same time.

It seems to me that, with refreshable braille displays, you're looking 
more at scenarios that combine local and remote playback at the same 
time. This has been out of scope of the Remote Playback API until now 
because synchronization between devices is a hard problem that the web 
platform has not solved yet. I think that matches what we discussed in 
previous iterations, and the Second Screen Working Group is likely to 
be reluctant to extend its scope to cover that.

If an application wants more than remoting media playback, it can take 
full control of the experience, including implementing a mode that 
combines local and remote playback, through the Presentation API that 
the group also develops. That API, though, requires the second screen 
to run a web application. If refreshable braille displays can present 
themselves as able to run web applications, then they can also be 
considered second screens for the Presentation API, and applications 
may use the Presentation API to synchronize local media playback with 
remote playback of alternative content.
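
For instance, here is a rough sketch of that approach, with a 
hypothetical receiver.html application and message format (real 
synchronization logic would need more care):

    // Controller page: play audio/video locally, forward active text
    // track cues to the application running on the second screen.
    const request = new PresentationRequest(['receiver.html']);

    async function startRemoteCaptions(video: HTMLVideoElement) {
      const connection = await request.start(); // user picks a display

      const track = video.textTracks[0];
      track.mode = 'hidden'; // fire cuechange without rendering locally

      track.addEventListener('cuechange', () => {
        const cues = Array.from(track.activeCues ?? []) as VTTCue[];
        connection.send(JSON.stringify({
          text: cues.map((cue) => cue.text).join('\n'),
        }));
      });
    }

    // Receiver side (the web application on the second screen), shown
    // as a comment since it runs in a separate browsing context:
    // navigator.presentation.receiver?.connectionList.then((list) => {
    //   list.connections.forEach((connection) => {
    //     connection.onmessage = (event) =>
    //       renderText(JSON.parse(event.data).text); // hypothetical
    //   });
    // });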

If the "local or remote" scenario seems relevant for refreshable braille 
displays for the Remote Playback API, I think that the proposed charter 
accommodates that already, but note there may still be things to improve 
in the Open Screen Protocol on top of which the Remote Playback API may 
be implemented, as I don't see any obvious way for a display to 
advertise itself as being only able to render text tracks in the Open 
Screen Protocol currently.

Francois.

Received on Wednesday, 8 December 2021 11:15:33 UTC