Time stamps in AudioProcessingEvent

Hi all,

In Chrome Canary as well as in Firefox, the AudioProcessingEvent delivered
to a script processor node's onaudioprocess handler has both a "timeStamp"
property and a "playbackTime" property. The timeStamp property is a time in
milliseconds in the same coordinate system as Date.now(), while playbackTime
is a time in seconds in the coordinate system of AudioContext.currentTime.
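Here is a minimal sketch of what I'm seeing (the buffer size and channel
counts are just illustrative):

    var ctx = new AudioContext();
    var node = ctx.createScriptProcessor(1024, 1, 1);
    node.onaudioprocess = function (event) {
        // event.timeStamp    -> milliseconds, same coordinate system as Date.now()
        // event.playbackTime -> seconds, same coordinate system as ctx.currentTime
        console.log(event.timeStamp, event.playbackTime, ctx.currentTime);
    };
    node.connect(ctx.destination);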

It almost looks like these two times can be used to synchronize events in
the Web Audio API world with those in the animation world, barring dynamic
changes to the audio route that change the output latency. Something along
the lines of the sketch below is what I have in mind.
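This is only a sketch, and it assumes the two properties refer to the same
instant, namely the moment the processed buffer will be played out (reusing
the ctx and node from the snippet above):

    node.onaudioprocess = function (event) {
        // Offset (in ms) between the Date.now() clock and the audio clock.
        var offsetMs = event.timeStamp - event.playbackTime * 1000;

        // Estimated Date.now() time at which a context time t (in seconds)
        // will actually be heard.
        var wallClockTimeFor = function (t) {
            return t * 1000 + offsetMs;
        };

        // ... drive requestAnimationFrame-based visuals against this estimate.
    };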

... but the timeStamp property is not part of the latest TR spec. Can we have 
some clarification on this?

IMHO, the inability to precisely synchronize visual events with audio events
is a very large hole in the API. On Mac OS X, for example, I found my app's
visuals completely out of sync with the audio when the output was piped to
AirPlay. (I apologize that I'm unable to verify this right now, since I run
Yosemite and can't use AirPlay.)

Best,
-Kumar

Received on Tuesday, 17 June 2014 14:13:19 UTC