[whatwg] VIDEO Timeupdate event frequency.

On Fri, Sep 10, 2010 at 4:05 AM, Silvia Pfeiffer
<silviapfeiffer1 at gmail.com> wrote:
> On Fri, Sep 10, 2010 at 7:28 PM, Biju <bijumaillist at gmail.com> wrote:
>>
>> https://bugzilla.mozilla.org/show_bug.cgi?id=571822
>> > Firefox fires the timeupdate event once per frame.
>> > Safari 5 and Chrome 6 fire every 250ms.  Opera 10.50 fires every 200ms.
>>
>>
>> Now in Firefox bug 571822 they are changing Firefox to fire the
>> timeupdate event every 250ms.
>>
>> But this takes away control from somebody who wants to do some image
>> processing on every frame, and it also means missed frames.
>>
>> So can we have a "newFrame" event, and/or a "minTimeupdate" property to
>> say what the minimum time interval between consecutive timeupdate
>> events should be?
>
> If we have a newFrame event, might it be an idea to actually hand over the
> frame data (audio + video) in the event? I would think that only people who
> want to do manipulations on the media data want that kind of resolution,
> and it might be more efficient to just provide the data with the event.

That would actually be a rather useful property.  I have several
examples of video/canvas integration that I show off regularly at
talks (and will have an article about on html5doctors.com soon), where
I just listen to the play event and start running a function every
20ms, stopping when I see that the video is stopped or paused.  Being
able to register the function for a newFrame event instead would avoid
that unnecessary computation, and getting the frame data directly,
rather than drawing the video into a backing canvas and then asking
for its ImageData, would shave some complexity off the code.
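
Roughly, the pattern I'm describing looks like this (just a sketch --
the element ids and the exact 20ms interval are placeholders, not the
actual code from my examples):

  var video  = document.getElementById('v');    // the <video> element
  var canvas = document.getElementById('c');    // backing <canvas>
  var ctx    = canvas.getContext('2d');
  var timer  = null;

  video.addEventListener('play', function () {
    timer = setInterval(function () {
      // Stop polling once playback pauses or ends.
      if (video.paused || video.ended) {
        clearInterval(timer);
        return;
      }
      // Draw the current frame into the backing canvas, then pull its pixels.
      ctx.drawImage(video, 0, 0, canvas.width, canvas.height);
      var frame = ctx.getImageData(0, 0, canvas.width, canvas.height);
      // ... per-frame processing on frame.data goes here ...
    }, 20);
  }, false);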

How should it return the data?  Perhaps the video data as an ImageData
object?  I don't know how audio would be returned, though.
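
For the sake of discussion, a handler for the proposed event might look
something like this (entirely hypothetical -- the event name and the
properties on the event object aren't specified anywhere):

  video.addEventListener('newFrame', function (e) {
    var pixels = e.imageData;  // ImageData for the decoded video frame?
    // e.audioData -- open question: what form would the audio take?
    // ... per-frame processing here, no backing canvas needed ...
  }, false);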

~TJ

Received on Friday, 10 September 2010 07:53:23 UTC