[whatwg] Timestamp from video source in order to sync (e.g. expose OGG timestamp to javascript)

On Tue, May 18, 2010 at 1:00 AM, Nikita Eelen <neelen at amvonet.com> wrote:
> I think he means something similar to what QuickTime Broadcaster and QuickTime
> Streaming Server do with a delay on a live stream, or what Wowza Media Server
> does with Flash Media Encoder when using H.264, unless I am misunderstanding
> something. Is that correct, Odin? Not sure how Icecast deals with it, but I
> bet it's a similar issue.

Yes, I initially used Darwin Streaming Server, but found Icecast2 much
better for *my use*, so I use it the same way. I have Icecast buffer
1 MB of data so that it can burst all of it to the client (the browser
in this case), letting the client's own buffering fill faster. So even
there we are already quite far behind real time.

On top of that, the browsers often stall for a few seconds, buffer a
bit more, and then continue playing (even though they have already
buffered more than a few seconds ahead!), so they drift even further
away from real time.
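
For what it's worth, here is a rough sketch of how one could watch
that client-side lag from script. The element id is made up, and
video.buffered is the TimeRanges attribute from the spec, which
browsers do not all implement fully yet:

var video = document.getElementById('live');
setInterval(function () {
  // Compare the playback position with the newest data the browser
  // has buffered; the difference is how far playback lags behind
  // what we have received (server-side delay comes on top of that).
  if (video.buffered.length > 0) {
    var newest = video.buffered.end(video.buffered.length - 1);
    var lag = newest - video.currentTime;
    console.log('Playing ' + lag.toFixed(1) + ' s behind the buffer edge');
  }
}, 1000);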



But I have important news: my bug at Mozilla was closed because they
argue the behaviour is already in the spec. Because:

> The startTime attribute must, on getting, return the earliest possible position, expressed in seconds.

Their reading is that in a live stream, the earliest possible position
is the point where I started the stream (which is how VLC behaves), so
the in-browser timeline already shows 00:31:30 if we are 31 minutes
and 30 seconds into the live stream.

If that reading holds, the spec is actually good enough for my
synchronisation needs.
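
Assuming Mozilla's reading, a sketch like this could map the playback
position back to wall-clock time. streamStartWallClock is something my
own page would have to know (say, sent by my server); it is not part
of the spec:

var video = document.getElementById('live');
// streamStartWallClock: a Date for when the live stream started,
// delivered by my own server (an assumption, not a spec feature).
function frameWallClock(streamStartWallClock) {
  // currentTime is assumed to count from the real start of the stream,
  // as in Mozilla's interpretation of "earliest possible position".
  return new Date(streamStartWallClock.getTime() +
                  video.currentTime * 1000);
}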

You can follow the Mozilla bug here:
<https://bugzilla.mozilla.org/show_bug.cgi?id=498253>


However, I find it rather hard to work out what the spec means,
because of *earliest POSSIBLE*. What counts as possible? With live
streaming it is not possible to seek further back in the stream. What
do you think is meant by this? If startTime does not help me, then
adding a field that exposes the _real_ timecode data from the video
would be very useful.

The spec touches on this in an example:
<http://www.whatwg.org/specs/web-apps/current-work/multipage/video.html#dom-media-starttime>

> For example, if two clips have been concatenated into one video file, but the video format
> exposes the original times for the two clips, the video data might expose a timeline that
> goes, say, 00:15..00:29 and then 00:05..00:38. However, the user agent would not expose
> those times; it would instead expose the times as 00:15..00:29 and 00:29..01:02, as a
> single video.

That's all well and good, but if startTime is not the earliest time
that actually exists, it would still be nice to get at the real
timecode data for live streaming and these syncing use cases.
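
In other words, the user agent would remap the second clip's native
timestamps onto one continuous timeline. Roughly, with the times from
the spec example written as plain seconds:

// Sketch of the remapping in the spec's example.
// Clip 1 natively covers 15..29, clip 2 natively covers 5..38.
function exposedTime(nativeTime, inSecondClip) {
  if (!inSecondClip) return nativeTime; // 15..29 stays 15..29
  return 29 + (nativeTime - 5);         // 5..38 becomes 29..62 (01:02)
}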


Justin Dolske's idea looks rather nice:
> This seems like a somewhat unfortunate thing for the spec, I bet everyone's
> going to get it wrong because it won't be common. :( I can't help but wonder if
> it would be better to have a startTimeOffset property, so that .currentTime et
> al. all still have a timeline starting from 0, and if you want the "real"
> time you'd use .currentTime + .startTimeOffset.
>
> I'd also suspect we'll want the default video controls to normalize everything
> to 0 (.currentTime - .startTime), since it would be really confusing otherwise.

from <https://bugzilla.mozilla.org/show_bug.cgi?id=498253#c3>
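
If I understand the proposal, usage would look something like this
(startTimeOffset is only an idea from that comment; it does not exist
in any spec or browser):

var video = document.getElementById('live');
// The proposed property: currentTime would always start at 0, and the
// "real" timeline is recovered by adding startTimeOffset.
var realTime = video.currentTime + video.startTimeOffset;
// The default controls would then keep showing a timeline from 0.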

-- 
Beste helsing,
Odin Hørthe Omdal <odin.omdal at gmail.com>
http://velmont.no
