[web-audio-api] Need a way to determine AudioContext time of currently audible signal (#12)

> Originally reported on W3C Bugzilla [ISSUE-20698](https://www.w3.org/Bugs/Public/show_bug.cgi?id=20698) Thu, 17 Jan 2013 14:15:09 GMT
> Reported by Joe Berkovitz / NF
> Assigned to 

Use case:

If one needs to display a visual cursor in relation to some onscreen representation of an audio timeline (e.g. a cursor on top of music notation or DAW clips), then knowing the real-time coordinates of what is coming out of the speakers is essential.

However, on any given implementation an AudioContext's currentTime may report a time somewhat ahead of the audio signal actually emerging from the device, by a fixed amount. If a sound is scheduled (even very far in advance) to be played at time T, it will actually be heard when AudioContext.currentTime = T + L, where L is a fixed latency.
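For illustration, a minimal sketch of the behavior described above, using ordinary Web Audio API calls (the latency L itself is not observable from script, which is the point of this issue):

```js
// Schedule a short beep at context time T. On a device with output
// latency L, the beep is audible when currentTime reads T + L, not T.
// L is implementation- and hardware-dependent and, as of this
// discussion, is not exposed to script.
const ctx = new AudioContext();

function scheduleBeep(t) {
  const osc = ctx.createOscillator();
  osc.connect(ctx.destination);
  osc.start(t);        // scheduled for context time T
  osc.stop(t + 0.05);  // heard at roughly T + L on the output device
}

scheduleBeep(ctx.currentTime + 1.0); // one second from "now"
```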

On Jan 16, 2013, at 2:05 PM cwilso@google.com wrote:

It's problematic to schedule other real-time events against the audio clock (even knowing precisely "what time it is" inside the drawing function) without a better understanding of the latency.

The idea we reached (I think Chris proposed it, but I can't honestly remember) was to have a performance.now()-referenced clock time on AudioContext that would tell you when AudioContext.currentTime was taken (or when that time will occur, if it's in the future); that would allow you to synchronize the two clocks.  The more I've thought about it, the more I like this approach: having something like AudioContext.currentSystemTime in the window.performance.now() reference frame.
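A sketch of how that synchronization might look, assuming the hypothetical AudioContext.currentSystemTime proposed above (reporting, in the performance.now() timebase, the moment at which currentTime was or will be true); neither the name nor the behavior is in the spec at this point:

```js
// Map a context time (seconds) to a performance.now() time (ms),
// assuming the hypothetical ctx.currentSystemTime is the
// performance.now() reading that corresponds to ctx.currentTime.
function contextTimeToPerformanceTime(ctx, contextTime) {
  // Offset between the two clocks, sampled at a single instant.
  const offset = ctx.currentSystemTime - ctx.currentTime * 1000;
  return contextTime * 1000 + offset; // ms in the performance.now() timebase
}
```

With such a mapping, a drawing function could convert a scheduled sound's context time into the same timebase as its animation frames and align visuals to it.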

On Jan 16, 2013, at 3:18 PM, Chris Rogers <crogers@google.com> wrote:

The general idea is that the underlying platforms/OSs can have very different latency characteristics, so I think you're looking for a way to query the system to find out what it is.  I think something like AudioContext.presentationLatency is what we're looking for.  Presentation latency is the time difference between when you tell an event to happen and when you actually hear it.  So, for example, with source.start(0) you would hope to hear the sound right now, but in reality you will hear it with some (hopefully) small delay.  One example where this could be useful is if you're trying to synchronize a visual "playhead" to the actual audio being scheduled...
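A sketch of that playhead use case, assuming the hypothetical presentationLatency property (in seconds) described above; drawPlayhead is a placeholder for application drawing code:

```js
// Drive a visual playhead from the audio clock, compensating for
// presentation latency so the cursor tracks what is audible now
// rather than what has merely been handed to the hardware.
function animatePlayhead(ctx, startTime, drawPlayhead) {
  function frame() {
    // presentationLatency is hypothetical: the proposed delay between
    // scheduling an event and actually hearing it.
    const audibleTime = ctx.currentTime - ctx.presentationLatency;
    drawPlayhead(audibleTime - startTime); // seconds into the piece
    requestAnimationFrame(frame);
  }
  requestAnimationFrame(frame);
}
```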

I believe the goal for any implementation should be to achieve as low a latency as possible, on par with desktop/native audio software on the same OS/hardware the browser runs on.  That said, as with other aspects of the web platform (page rendering speed, cache behavior, etc.), performance is something that is tuned (and hopefully improved) over time for each browser implementation and OS.

---
Reply to this email directly or view it on GitHub:
https://github.com/WebAudio/web-audio-api/issues/12

Received on Wednesday, 11 September 2013 14:28:39 UTC