- From: Kumar <srikumarks@gmail.com>
- Date: Sun, 13 Sep 2015 08:45:49 +0530
- To: "public-audio@w3.org" <public-audio@w3.org>
Received on Sunday, 13 September 2015 03:16:45 UTC
Hi all,

This is to recap an earlier conversation on synchronizing audio precisely with MIDI events and visuals when we're scheduling these a little into the future: http://lists.w3.org/Archives/Public/public-audio/2013AprJun/0456.html

Am I right in noticing that we don't yet have a solution that reliably maps between AudioContext.currentTime and the DOMHighResTimeStamp value obtained through performance.now() or requestAnimationFrame? (This doesn't, of course, apply to OfflineAudioContext.) Given that the spec says currentTime "increases in realtime", the inability to connect it with a DOMHighResTimeStamp makes precision scheduling of MIDI and visuals hard or impossible to do reliably.

This issue is only somewhat related to getting latency info. It would be possible to construct an API for this that includes the latency info (for example, context.currentRealTime - performance.now()), or to have the latency be provided separately. Either way, I think a mapping is indispensable.

Best,
-Kumar
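P.S. For concreteness, the kind of ad-hoc mapping pages have to resort to today looks something like the sketch below. The function name and the sampling strategy are purely illustrative, not a proposed API; it estimates an offset between the two clocks by sampling them back-to-back and keeping the tightest pair, and it cannot account for output latency because the spec does not expose it.

```javascript
// Illustrative sketch only: estimate an offset between an AudioContext's
// currentTime (seconds) and performance.now() (milliseconds) by reading
// both clocks back-to-back several times and keeping the sample pair
// with the smallest read gap. The result ignores output latency.
function estimateAudioClockOffset(audioContext, samples = 10) {
  let bestOffset = 0;
  let bestGap = Infinity;
  for (let i = 0; i < samples; i++) {
    const t1 = performance.now();
    const audioTime = audioContext.currentTime; // seconds
    const t2 = performance.now();
    const gap = t2 - t1;
    if (gap < bestGap) {
      bestGap = gap;
      // Offset in milliseconds between the two clocks:
      // performance.now() ≈ audioContext.currentTime * 1000 + offset
      bestOffset = (t1 + t2) / 2 - audioTime * 1000;
    }
  }
  return bestOffset;
}
```

With such an offset one can schedule a visual at performance-clock time `audioEventTime * 1000 + offset`, but the estimate drifts and is only as good as the sampling jitter, which is exactly why a spec-level mapping seems indispensable.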