- From: Chris Wilson <cwilso@google.com>
- Date: Mon, 23 Jun 2014 10:51:33 -0700
- To: "Srikumar K. S." <srikumarks@gmail.com>
- Cc: Audio WG <public-audio@w3.org>
- Message-ID: <CAJK2wqW7os3J2FDQiLv3RnG9LhNwG=rV94T20BU26oy58OOdhQ@mail.gmail.com>
"timeStamp" is part of the DOM level 2 Event API. It captures the time at which the event was created, NOT the "current time in the audio stream, in Date.now() coordinate system", which is what you really want. On Tue, Jun 17, 2014 at 7:12 AM, Srikumar K. S. <srikumarks@gmail.com> wrote: > Hi all, > > In Chrome Canary as well as in Firefox, the AudioProcessingEvent delivered > to a script processor node's onaudioprocess method has both a "timeStamp" > property and a "playbackTime" property. The timeStamp property is a time > in milliseconds in the same coordinate system as Date.now(), while > playbackTime is a time in the coordinate system of > AudioContext.currentTime. > > It almost looks like these two times can be used to synchronize events in > the web audio api world with those in the animation world, barring dynamic > changes to the audio route that change the output latency. > > ... but the timeStamp property is not part of the latest TR spec. Can we > have > some clarification on this? > > IMHO, the inability to synchronize visual events with audio events > precisely > is a very large hole in the API. With MacOSX, for example, I found my app's > visuals completely out of sync with the audio when I had the audio output > piped to AirPlay. (I apologize I'm unable to verify this right now since > I run Yosemite and I'm unable to use AirPlay.) > > Best, > -Kumar > > >
Received on Monday, 23 June 2014 17:52:04 UTC