Re: [media-and-entertainment] Use case description: responsive graphics synced with video (#35)

As discussed in yesterday's [Media Timed Events / WICG DataCue API call](https://www.w3.org/2020/04/20-me-minutes.html), here is a short use case description that may be included as an additional use case in the [IG Note](https://w3c.github.io/me-media-timed-events/#dfn-media-timed-event):

**Media stream with video and synchronized graphics**

A content provider wants to provide synchronized graphical elements along with a video. Graphics and video are delivered separately to a client device, either in-band (i.e., multiplexed with the media stream) or as a sidecar file. The graphical element stream or file contains media timed events giving the start and end times of each graphical element, similar to a subtitle stream or file. A graphics renderer takes the graphical element stream or file as input and renders each element on top of the video image according to its media timed events.
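The selection step could be sketched as follows. This is a minimal illustration, not an implementation of any particular API: the event shape (`id`, `startTime`, `endTime` in seconds) is an assumption modeled on subtitle cues, and `activeGraphics` is a hypothetical helper name.

```javascript
// Hypothetical timed-event shape carried in the graphical element
// stream; field names are assumptions modeled on subtitle cues.
// Each event holds a start/end time (seconds) and an element id.
function activeGraphics(events, currentTime) {
  // Return the graphical elements whose [startTime, endTime)
  // interval contains the current media time.
  return events.filter(
    (e) => currentTime >= e.startTime && currentTime < e.endTime
  );
}

// Example timed events, similar to subtitle cues:
const events = [
  { id: "logo",  startTime: 0.0, endTime: 10.0 },
  { id: "score", startTime: 5.0, endTime: 8.0 },
];

console.log(activeGraphics(events, 6.0).map((e) => e.id)); // → ["logo", "score"]
```

In a browser, the renderer would re-evaluate this selection as the video's current time advances and draw the active elements over the video.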

The purpose of rendering the graphical elements at the client device is that the renderer can optimize them for the device's display parameters, such as the display aspect ratio and orientation. Another use case is localization of the graphical elements.

The synchronization between video and graphics should ideally be frame accurate.
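One way to reason about frame accuracy is to snap event times to frame boundaries, so an overlay appears or disappears on an exact video frame. The sketch below assumes the frame rate is known to the client (e.g., signaled alongside the stream); the function name is illustrative.

```javascript
// Sketch: snap an event time (seconds) to the nearest frame boundary,
// assuming a known, constant frame rate. Not from any spec; for
// illustration of what "frame accurate" implies for event timing.
function snapToFrame(timeSeconds, frameRate) {
  const frameIndex = Math.round(timeSeconds * frameRate);
  return frameIndex / frameRate;
}

console.log(snapToFrame(5.01, 25)); // → 5 (frame 125 at 25 fps)
```

Without such alignment, an event fired from a coarse-grained timer may miss its intended frame by one or more frames.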



-- 
GitHub Notification of comment by pthopesch
Please view or discuss this issue at https://github.com/w3c/media-and-entertainment/issues/35#issuecomment-616995589 using your GitHub account

Received on Tuesday, 21 April 2020 07:07:05 UTC