- From: Srikumar K. S. <srikumarks@gmail.com>
- Date: Tue, 17 Jun 2014 19:42:37 +0530
- To: Audio WG <public-audio@w3.org>
- Message-Id: <48C8044C-E150-4609-8539-23AA79C99C04@gmail.com>
Hi all,

In Chrome Canary as well as in Firefox, the AudioProcessingEvent delivered to a ScriptProcessorNode's onaudioprocess handler has both a "timeStamp" property and a "playbackTime" property. The timeStamp property is a time in milliseconds in the same coordinate system as Date.now(), while playbackTime is a time in the coordinate system of AudioContext.currentTime. It looks as though these two times could be used to synchronize events in the Web Audio API world with events in the animation world, barring dynamic changes to the audio route that change the output latency ... but the timeStamp property is not part of the latest TR spec. Can we have some clarification on this?

IMHO, the inability to synchronize visual events precisely with audio events is a very large hole in the API. On Mac OS X, for example, I found my app's visuals completely out of sync with the audio when I had the audio output piped to AirPlay. (I apologize that I'm unable to verify this right now, since I'm running Yosemite and cannot use AirPlay.)

Best,
-Kumar
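P.S. For concreteness, here is a minimal sketch of the mapping I have in mind, assuming the behaviour observed above (timeStamp in Date.now() milliseconds, playbackTime in AudioContext seconds, and the two treated as readings of the same moment). The helper name is illustrative, and the whole thing ignores output latency, which is exactly what breaks over AirPlay:

    var ctx = new AudioContext();
    var node = ctx.createScriptProcessor(1024, 1, 1);
    node.onaudioprocess = function (event) {
      // Estimate the Date.now() epoch corresponding to AudioContext time 0.
      var epochAtContextZero = event.timeStamp - event.playbackTime * 1000;
      // Map a context time (seconds) to an approximate Date.now() time (ms),
      // e.g. to line up a visual change with a scheduled audio event.
      function contextTimeToEpochMs(contextTime) {
        return epochAtContextZero + contextTime * 1000;
      }
      // ... use contextTimeToEpochMs(...) inside a requestAnimationFrame loop.
    };
    node.connect(ctx.destination);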
Attachments
- application/pkcs7-signature attachment: smime.p7s
Received on Tuesday, 17 June 2014 14:13:19 UTC