[whatwg] Timestamp from video source in order to sync (e.g. expose OGG timestamp to javascript)

On Tue, May 18, 2010 at 7:31 AM, Odin Omdal Hørthe <odin.omdal at gmail.com> wrote:
> Hello!
>
> I filed bugs at mozilla and in chromium because I want to sync real
> time data stream to live video. Some of them told me to send it here
> as well. :-)
>
> It's only possible to get the relative playback time with HTML5 in JavaScript. I
> want the absolute timestamp that's embedded in the Ogg stream.
>
> The spec only deals with relative times, and not with getting that
> information out of the container itself.
>
> Here's the deal:
> I stream conferences as Ogg Theora+Vorbis via Icecast2. I have built a
> site that shows the video and then automatically shows the slides (as PNG
> files) as well. I use Orbited (COMET) to have the server PUSH my 'next'
> presses on my keyboard.
>
> The problem is that Icecast does heavy buffering, and so does the client, so
> that while I switch the slides, the browser will go from slide 3 to 4 WAY
> too early (by anywhere from 10 seconds to 1 minute).
>
> If I could get the timestamp OR time-since-started-sending/recording from
> the ogg file in javascript, I'd be able to sync everything.
>
> There are multiple ways to sync this, maybe even a stream with the slide data
> INSIDE the Ogg file; however, AFAIK there's also no way of getting out such
> arbitrary streams.
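
For illustration only, here is a minimal client-side sketch of the sync being described above. It assumes a hypothetical video.streamTime property carrying the absolute Ogg timestamp (no such property exists in the spec; video.currentTime, the only timing that is exposed, is relative to playback start), and that each COMET message carries the stream time at which the slide was switched:

    // Hypothetical sketch: delay COMET-pushed slide changes until the
    // (buffered) live video has actually reached the same point.
    var video = document.getElementById('live');
    var slide = document.getElementById('slide');
    var pending = [];  // slide changes waiting for playback to catch up

    // Called by the Orbited/COMET channel; msg is assumed to look like
    // {streamTime: 1234.5, png: 'slide-04.png'}.
    function onSlideMessage(msg) {
      pending.push(msg);
    }

    setInterval(function () {
      var now = video.streamTime;  // ASSUMED absolute Ogg timestamp
      while (pending.length && pending[0].streamTime <= now) {
        slide.src = pending.shift().png;  // only now show the new slide
      }
    }, 250);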

There is a Multitrack API proposal at
http://www.w3.org/WAI/PF/HTML/wiki/Media_MultitrackAPI to make the
tracks of a media resource available to the browser. But it hasn't
progressed into the spec yet.
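
Purely as an illustration of what script access to an in-band data track could enable for this use case, here is a sketch; every name in it (video.tracks, kind, oncuechange, activeCues, cue.text) is an assumption made for the example, not taken from the Multitrack API proposal:

    // Hypothetical only: listen for slide-change cues muxed into the stream.
    var video = document.getElementById('live');
    for (var i = 0; i < video.tracks.length; i++) {
      var track = video.tracks[i];
      if (track.kind === 'metadata') {        // assumed slide-change track
        track.oncuechange = function () {
          var cue = this.activeCues[0];
          if (cue) {
            // cue.text is assumed to carry the slide image name
            document.getElementById('slide').src = cue.text;
          }
        };
      }
    }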

Cheers,
Silvia.

Received on Monday, 17 May 2010 18:56:55 UTC