[whatwg] Timing API proposal for measuring intervals

On Tue, Jul 12, 2011 at 7:23 AM, Chris Rogers <crogers at google.com> wrote:

> In the CoreAudio case, the AudioTimeStamp contains *both* the host-time
> (system clock) and the sample time (based on audio hardware).  This creates
> a relationship between the two clocks.  As an example of how these two
> clocks can be used together for synchronization, audio applications use the
> high-resolution timestamp of incoming MIDI messages to schedule audio
> synthesis to happen with very low jitter by doing sample-accurate scheduling
> when rendering the audio stream.
>
> Because of clock-drift, the system clock that James is proposing cannot
> *directly* be the same clock as what I'm proposing in the Web Audio API
> AudioContext.currentTime attribute.  But there are ways to translate
> between the two in very useful ways.
>

Thanks Chris, I thought you'd know the answer :-).
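
Just to check my own understanding, here's roughly the kind of translation I
think you mean, written against the Web Audio draft. It's only a sketch:
systemNow() is a placeholder for whatever monotonic, high-resolution clock
James's proposal ends up exposing (I'm assuming milliseconds), and because of
the drift you mention the offset would have to be re-estimated periodically
rather than computed once:

  // Sketch only: systemNow() is a placeholder, not a real API; ctx is a
  // Web Audio AudioContext, whose currentTime is in seconds.

  function estimateOffset(ctx) {
    // Pair the two clocks by sampling them back to back a few times and
    // keeping the pairing that cost the least to measure.
    var best = null;
    for (var i = 0; i < 10; i++) {
      var t0 = systemNow();
      var audioSec = ctx.currentTime;  // advances per render quantum, so approximate
      var t1 = systemNow();
      var cost = t1 - t0;
      if (!best || cost < best.cost) {
        best = { cost: cost, hostMs: (t0 + t1) / 2, audioSec: audioSec };
      }
    }
    // Offset such that audioTime ~= hostTime/1000 + offset.  Because of
    // clock drift this needs refreshing periodically, not computed once.
    return best.audioSec - best.hostMs / 1000;
  }

  // Translate a system-clock timestamp (e.g. an input or MIDI event time,
  // in milliseconds) onto the audio clock and schedule a buffer there.
  function scheduleAtHostTime(ctx, buffer, hostMs, offset) {
    var src = ctx.createBufferSource();
    src.buffer = buffer;
    src.connect(ctx.destination);
    src.noteOn(hostMs / 1000 + offset);  // sample-accurate start on the audio clock
  }

Is that roughly the shape of it?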

Suppose we wanted to sync animation (either scripted animation or CSS
animation) to audio. We'd want them to use the audio clock, right?
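
Concretely, for the scripted case I'm imagining something like the sketch
below (just an illustration: 'element' is an arbitrary node being animated,
and I've left off the requestAnimationFrame vendor prefixes):

  // Drive a scripted animation off the audio clock so it stays locked to
  // what the AudioContext is actually rendering, rather than the system
  // clock.  startSec is recorded when playback begins.

  var startSec = ctx.currentTime;

  function drawFrame() {
    // Position along the audio timeline, in seconds since playback started.
    var elapsed = ctx.currentTime - startSec;

    // e.g. move the element 100px per second of audio.
    element.style.left = (elapsed * 100) + "px";

    window.requestAnimationFrame(drawFrame);
  }

  window.requestAnimationFrame(drawFrame);

The CSS animation case seems harder, since there's no obvious way to feed it
a different clock from script.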

Rob
-- 
"If we claim to be without sin, we deceive ourselves and the truth is not in
us. If we confess our sins, he is faithful and just and will forgive us our
sins and purify us from all unrighteousness. If we claim we have not sinned,
we make him out to be a liar and his word is not in us." [1 John 1:8-10]

Received on Monday, 11 July 2011 17:57:56 UTC