Re: [web-audio-api] Need a way to determine AudioContext time of currently audible signal (#12)

> [Original comment](https://www.w3.org/Bugs/Public/show_bug.cgi?id=20698#1) by Chris Wilson on W3C Bugzilla. Tue, 02 Apr 2013 14:30:39 GMT

Can we clearly delineate the two?  I'm not positive I understand what "latency discovery" means, because there is one piece of information (the average processing block size) that might be interesting; but I intended this issue to cover the explicit case of "I need to synchronize between the audio time clock and the performance clock at reasonably high precision."  That is, for example:

1) I'm playing a looped sequence through Web Audio; when I get a timestamped MIDI message (or a keypress, for that matter), I want to be able to record it and play that sequence back at the right time (a sketch of this clock mapping follows these examples).

2) I want to play back a sequence of combined MIDI messages and Web Audio, and have them synchronized to a sub-latency level (given the latency today on Linux and even Windows, this is a requirement).  Even if my Web Audio playback latency is 20 ms, I should be able to pre-schedule MIDI and audio events to occur within a millisecond or so of each other.
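
For concreteness, here is a minimal sketch of the mapping case 1 needs, written against the `AudioContext.getOutputTimestamp()` pairing that was later added to the spec; the helper name and the keydown handler are just illustrative:

```js
// Minimal sketch, assuming AudioContext.getOutputTimestamp() (the
// {contextTime, performanceTime} pairing later added to the spec).
// The helper name is hypothetical.
const ctx = new AudioContext();

// Map a performance.now()-based timestamp (e.g. a MIDI message's or
// keypress's event.timeStamp, in ms) onto the audio clock (in seconds).
function performanceToContextTime(perfTimeMs) {
  const { contextTime, performanceTime } = ctx.getOutputTimestamp();
  return contextTime + (perfTimeMs - performanceTime) / 1000;
}

// Use case 1: record a keypress against the audio clock so it can be
// replayed at the right point in the looped sequence.
window.addEventListener("keydown", (e) => {
  const when = performanceToContextTime(e.timeStamp);
  console.log("event at context time", when);
});
```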

Now, there is a level of planning for which knowing the "average latency" (related to the processing block size, I imagine) would be useful; I could use it to pick a scheduling margin in my scheduler, for example.  But that's not the same thing.  Perhaps these should be solved together; I just don't want the synchronization requirement to be dropped in favor of latency discovery.
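
To illustrate case 2 under the same assumption: once the two clocks can be related, a sequencer can schedule the audio side on the context clock and hand Web MIDI a timestamp on the performance clock, so both land together regardless of the output latency.  The helper names and note parameters below are illustrative:

```js
// Sketch of co-scheduling MIDI out and Web Audio, again assuming
// AudioContext.getOutputTimestamp(); midiOutput is a Web MIDI
// MIDIOutput obtained elsewhere.
function contextToPerformanceTime(ctx, ctxTimeSec) {
  const { contextTime, performanceTime } = ctx.getOutputTimestamp();
  return performanceTime + (ctxTimeSec - contextTime) * 1000; // s -> ms
}

function scheduleNote(ctx, midiOutput, ctxTimeSec, freq) {
  // Audio side: schedule a short tone on the context clock.
  const osc = new OscillatorNode(ctx, { frequency: freq });
  osc.connect(ctx.destination);
  osc.start(ctxTimeSec);
  osc.stop(ctxTimeSec + 0.25);

  // MIDI side: send a timestamped note-on (middle C, velocity 100)
  // expressed on the performance clock.
  midiOutput.send([0x90, 60, 100], contextToPerformanceTime(ctx, ctxTimeSec));
}
```

Even with 20 ms of output latency, both events are pre-scheduled far enough ahead that the implementations can align them to well under a millisecond of each other.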
