Re: [web-audio-api] Need a way to determine AudioContext time of currently audible signal (#12)

> [Original comment](https://www.w3.org/Bugs/Public/show_bug.cgi?id=20698#3) by Ehsan Akhgari [:ehsan] on W3C Bugzilla. Tue, 02 Apr 2013 18:40:22 GMT

I believe we're talking about two sources of latency here. One is the clock drift between what we measure on the main thread through AudioContext.currentTime and the actual clock on the audio thread. The other is the latency between the "play" call on the audio thread and the point where the OS actually starts to hand the buffer off to the sound card (plus, potentially, a further delay until your speakers play out what the sound card received.)  On top of all that, if the implementation uses system-level APIs which do not provide enough resolution (as is the case on Windows XP, for example), an additional artificial latency is introduced into the calculations because of the inability to measure time precisely enough.
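To make the components above concrete, here is a minimal sketch of how the audible time of a scheduled sample could be estimated once those latencies are known. The function and its parameter names are hypothetical; the graph-processing and device latencies stand in for quantities an implementation would have to expose or measure.

```javascript
// Sketch (hypothetical helper): when does a sample scheduled "now"
// actually reach the listener, given the latency sources described above?
function estimateAudibleTime(currentTime, graphLatency, outputLatency) {
  // currentTime:   AudioContext.currentTime at scheduling time (seconds)
  // graphLatency:  processing delay inside the audio graph
  // outputLatency: OS/driver/hardware delay until the speaker emits sound
  return currentTime + graphLatency + outputLatency;
}

// e.g. a sample scheduled at t = 1.0 s with 5 ms of graph latency and
// 20 ms of device latency becomes audible around t ≈ 1.025 s
const audible = estimateAudibleTime(1.0, 0.005, 0.020);
```

Any clock-resolution error (the Windows XP case) would add noise on top of this sum rather than a fixed offset, which is why it is called out separately above.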

The use case of syncing the display of something on the screen with sound coming out of the speakers is very hard to satisfy, since browsers generally do not provide any guarantee on when the updates resulting from a change in the DOM tree or a Web API call will be reflected on the screen.  On an implementation that strives to provide 60fps rendering, this delay can be as high as ~16ms even in the best case, and much higher if the implementation is suffering from frame misses.  So, no matter what API we provide here, there will _always_ be a delay involved in getting stuff on the screen.
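Given that unavoidable frame delay, the best an application can do is decide, on each animation frame, whether the paired visual change should go out now. A sketch of that decision, assuming a ~16ms frame interval at 60fps (the function and parameter names are hypothetical):

```javascript
// Sketch: inside a requestAnimationFrame callback, should the visual
// paired with a scheduled sound be drawn on this frame?
const FRAME_INTERVAL = 1 / 60; // seconds, best case on a 60fps renderer

function shouldDrawThisFrame(audibleTime, now) {
  // Draw once the sound will become audible before the *next* frame
  // could be presented; drawing any earlier only widens the A/V gap.
  return audibleTime <= now + FRAME_INTERVAL;
}
```

Even with this, the draw can land a full frame (or more, on a miss) away from the audible instant, which is the point being made above.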

For the MIDI use case, I imagine knowing the latest measured drift from the audio thread clock and what AudioContext.currentTime returns should be enough, right?
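A sketch of how that drift might be estimated: sample the main-thread clock and AudioContext.currentTime together a few times, then average the offsets. The helper name is hypothetical, and the pairs here are synthetic; in a browser they would come from reading performance.now()/1000 alongside ctx.currentTime.

```javascript
// Sketch (hypothetical helper): estimate main-thread vs. audio-thread
// clock offset from paired readings, for mapping MIDI timestamps.
function estimateDrift(samples) {
  // samples: [{ mainClock, audioClock }, ...] — both in seconds.
  // Averaging smooths out scheduling jitter in the individual reads.
  const sum = samples.reduce(
    (acc, s) => acc + (s.mainClock - s.audioClock), 0);
  return sum / samples.length;
}

// A MIDI event stamped in main-thread time could then be mapped into
// AudioContext time as: audioTime = mainTime - estimateDrift(samples)
```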

---
Reply to this email directly or view it on GitHub:
https://github.com/WebAudio/web-audio-api/issues/12#issuecomment-24244084

Received on Wednesday, 11 September 2013 14:32:45 UTC