Synchronizing audio with other events ...

Hi all,

Which API features do I use to trigger (for example) MIDI and audio events synchronously to within 1 ms of precision, given a sufficiently precise MIDI scheduler implementation?

With the current state of the API spec, it appears that the precision to which I can synchronize scheduled MIDI and audio is limited by the buffer size used by the audio API implementation. Unless I'm missing something (in which case I'd be happy to hear the solution), this looks like a gap in the API - one that could be filled fairly easily.

Regarding synchronization of audio events with visuals at 60 fps, current implementations of the Web Audio API update AudioContext.currentTime frequently enough that checking currentTime within a requestAnimationFrame callback is adequate for most purposes (though not ideal). At a sample rate of 44100 Hz, one 60 fps frame (~16.7 ms) corresponds to 735 samples, and current buffer sizes in implementations are 256 samples or fewer, as far as I know. By the same token, I would need currentTime to update several times per millisecond to get 1 ms precision with MIDI, which is too much to ask.
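For concreteness, here is a rough sketch (TypeScript) of the polling approach I mean, using the unprefixed AudioContext for brevity; drawFrame() stands in for whatever rendering you do and is hypothetical:

    // Poll AudioContext.currentTime from requestAnimationFrame to drive visuals.
    const ctx = new AudioContext();

    const osc = ctx.createOscillator();
    osc.connect(ctx.destination);
    const noteTime = ctx.currentTime + 1.0; // schedule a note 1 s out
    osc.start(noteTime);
    osc.stop(noteTime + 0.25);

    function animate(): void {
      // currentTime advances in buffer-sized steps, so this comparison is
      // only accurate to within one buffer (<= 256 samples, ~6 ms at 44.1 kHz).
      drawFrame(ctx.currentTime >= noteTime);
      requestAnimationFrame(animate);
    }
    requestAnimationFrame(animate);

    // Hypothetical rendering hook.
    function drawFrame(noteSounding: boolean): void { /* ... */ }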

However, if along with AudioContext.currentTime I also have the DOMTimeStamp of the moment at which the sample corresponding to currentTime will exit the audio subsystem, then the precision to which I can schedule MIDI and audio will no longer be limited by the audio buffer size. Indeed, the buffer size could be 2048 samples for all I care (though event-to-sound latency would suffer at that size). This would also permit better synchronization with visuals.
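To illustrate, here is a sketch of how such a pairing would be used. The getOutputTimestamp() function below is the proposed/assumed API, not anything that exists today; the rest assumes Web MIDI's MIDIOutput.send(data, timestamp), which takes a timestamp on the performance clock:

    // Sketch, assuming a hypothetical pairing of an AudioContext time with
    // the DOMTimeStamp at which that sample exits the audio subsystem.
    interface AudioTimestamp {
      contextTime: number;     // seconds, on the AudioContext timeline
      performanceTime: number; // ms, when that sample actually exits the subsystem
    }

    // Assumed API - this is the gap being described.
    declare function getOutputTimestamp(ctx: AudioContext): AudioTimestamp;

    // Convert an AudioContext schedule time (seconds) into a performance-clock
    // timestamp (ms) usable by, e.g., MIDIOutput.send(data, timestamp).
    function audioTimeToPerformanceTime(ctx: AudioContext, when: number): number {
      const ts = getOutputTimestamp(ctx);
      return ts.performanceTime + (when - ts.contextTime) * 1000;
    }

With this mapping, a note scheduled at audio time T and a MIDI message sent with timestamp audioTimeToPerformanceTime(ctx, T) should coincide regardless of the buffer size.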

Is there currently any way to calculate this DOMTimeStamp value or is this indeed a gap?

A minimal test case for this might be: map a key on the keyboard to trigger, with a delay of 1 second, a MIDI 'C' on channel 1 simultaneously with an oscillator-based note, and check whether the time difference between the onsets of the two sounds is a constant, independent of when the key is pressed. Something like the sketch below.
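Here is that test case sketched out, reusing the hypothetical audioTimeToPerformanceTime() mapping from above and assuming a MIDIOutput already obtained via navigator.requestMIDIAccess():

    function onKeyDown(ctx: AudioContext, midiOut: MIDIOutput): void {
      const when = ctx.currentTime + 1.0; // 1 second from now

      // Audio side: schedule an oscillator note on the AudioContext clock.
      const osc = ctx.createOscillator();
      osc.frequency.value = 261.63; // middle C
      osc.connect(ctx.destination);
      osc.start(when);
      osc.stop(when + 0.5);

      // MIDI side: note-on/off for middle C (0x3C) on channel 1, stamped
      // with the equivalent performance-clock times.
      midiOut.send([0x90, 0x3c, 0x7f], audioTimeToPerformanceTime(ctx, when));
      midiOut.send([0x80, 0x3c, 0x00], audioTimeToPerformanceTime(ctx, when + 0.5));
    }

If the mapping is correct, the onset difference between the two sounds should be a fixed constant (the MIDI-vs-audio output latency), no matter when the key is pressed.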

Thanks.
-Kumar

Received on Friday, 31 May 2013 11:24:47 UTC