[whatwg] Timed tracks: feedback compendium

On Wed, Oct 20, 2010 at 9:59 AM, Odin Omdal Hørthe <odin.omdal at gmail.com> wrote:
> On Wed, Sep 8, 2010 at 1:19 AM, Ian Hickson <ian at hixie.ch> wrote:
>>> [...] You're also excluding roll-on captions then, which are a
>>> feature of live broadcasting.
>>
>> It isn't clear to me that an external file would be a good solution for
>> live broadcasting, so I'm not sure this really matters.
>
> The standards-loving Agency for Public Management and eGovernment
> here in Norway is, like the rest of the world, taking an interest in
> HTML5 video and kicking the tires. I've been streaming many
> conferences with Ogg Theora, using Cortado as a fallback for legacy
> browsers (+ Safari).
>
> Now it has come to the point where we are required to follow the WAI
> WCAG requirements, so we have to caption the live video
> streams/broadcasts.
>
> Given the (unsurprising) lack of browser support for timed tracks on
> live streams, for now I am going to burn the text into the video
> itself. That is not a good long-term solution, though. Once browsers
> implement the new startOffsetTime, I will be able to send the text
> via a WebSocket to JavaScript and have it synced to the video (along
> with the slide images).
>
> However, it would be very nice to be able to send this text to the
> caption track and not have to reimplement a user interface for
> turning captions on and off, etc. (I assume user agents will provide
> that). There would also be other benefits to streaming directly as a
> timed track, such as the user agent knowing what the data is (so
> that it can do smart things with it).
>
> Accessibility is quite a universal requirement, and it would be very
> nice if live streaming could be part of the same framework.
>
>
> Otherwise, what other way is there to caption such live conferences,
> or to deliver real-time metadata from a live video?
>
> Maybe I could even send JSON in the metadata track announcing new
> slides as they appear? Or even send the slides (images) themselves
> as data: URLs in the track?
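
On the interim plan: since startOffsetTime is not implemented anywhere
yet, the WebSocket approach has to carry its own clock for now. A very
rough sketch of what I mean (the endpoint and the message format are,
of course, made up):

  // Very rough sketch: the server pushes caption cues over a WebSocket
  // with times on its own clock (seconds since the stream started),
  // plus a "now" message so the client can line that clock up with
  // video.currentTime. Once startOffsetTime is implemented, the "now"
  // message becomes unnecessary.
  var video   = document.querySelector('video');
  var overlay = document.getElementById('captions'); // <div> over the video
  var cues    = [];  // {start, end, text}, in stream-clock seconds
  var offset  = 0;   // stream clock minus video.currentTime

  var ws = new WebSocket('ws://example.net/captions'); // made-up endpoint
  ws.onmessage = function (event) {
    var msg = JSON.parse(event.data);
    if (msg.now !== undefined) offset = msg.now - video.currentTime;
    if (msg.cue) cues.push(msg.cue);
  };

  // Show whichever cue covers the current playback position.
  setInterval(function () {
    var t = video.currentTime + offset;
    var text = '';
    for (var i = 0; i < cues.length; i++) {
      if (cues[i].start <= t && t < cues[i].end) { text = cues[i].text; break; }
    }
    overlay.textContent = text;
  }, 250);

Not pretty, but it keeps the text out of the video frames until the
track API arrives.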

You should look at WebSRT and the TimedTrack interface defined on
<video> (including in-band captions). It covers both the captions and
the JSON idea. AFAIK no browser has shipped an implementation yet, but
that is where things are going. So if you prepare your interfaces on
the assumption that it will be implemented roughly as it is specified
today, I think you will be fine.
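
To give you an idea, the script side could end up looking roughly like
this once the interface is implemented. The names below (addTrack, the
cue constructor) are only illustrative, not the final interface; check
the current spec text before you build against it:

  // Rough sketch only -- method and constructor names are illustrative.
  var video  = document.querySelector('video');
  var track  = video.addTrack('captions', 'Live captions', 'en'); // hypothetical
  var slides = video.addTrack('metadata', 'Slides', 'en');        // hypothetical

  var ws = new WebSocket('ws://example.net/live-tracks'); // made-up endpoint
  ws.onmessage = function (event) {
    var msg = JSON.parse(event.data);
    if (msg.kind === 'caption') {
      // The UA takes care of rendering and of the captions on/off UI.
      track.addCue(new TimedTrackCue(msg.id, msg.start, msg.end, msg.text));
    } else if (msg.kind === 'slide') {
      // A metadata cue whose text is JSON (it could just as well carry
      // a data: URL for the slide image); your script listens for cue
      // changes and updates the slide view.
      slides.addCue(new TimedTrackCue(msg.id, msg.start, msg.end,
                                      JSON.stringify({slide: msg.url})));
    }
  };

The important part is that the captions end up in a real caption track,
so the user agent's own UI and accessibility hooks apply, while the
slide data stays in a metadata track that only your script looks at.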

BTW: for Ogg Theora you can create your captions in SRT, encode them
into an Ogg Kate[1] track, and they will show up in Cortado. Browsers
are not using that data yet, though. Also, Monty's notes on how the
recent Xiph video on digital audio/video was made are a very good
source of information on current best practice[2], and that workflow
should carry over fairly easily once browsers add support.
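
In case it helps, SRT itself is trivial to generate; a cue is just a
number, a time range and the text (the caption text here is of course
just an example):

  1
  00:00:05,000 --> 00:00:08,200
  Welcome, and thanks for joining the stream.

  2
  00:00:08,200 --> 00:00:12,000
  First up: captioning live video.

The OggKate wiki page[1] walks through turning a file like that into a
Kate track and muxing it into the Ogg stream.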

Cheers,
Silvia.

[1] http://wiki.xiph.org/OggKate
[2] http://wiki.xiph.org/A_Digital_Media_Primer_For_Geeks_%28episode_1%29/making

Received on Tuesday, 19 October 2010 17:43:14 UTC