Re: Synchronizing audio with other events ...

From: Russell McClellan <russell.mcclellan@gmail.com>
Date: Sun, 13 Sep 2015 10:18:45 -0400
Message-ID: <CAELY0NRgoYjz2g-Q1jY4pMDm6a2e-+kpG=cBCMgBdOnWC1vCWg@mail.gmail.com>
To: Srikumar Subramanian <srikumarks@gmail.com>
Cc: "public-audio@w3.org" <public-audio@w3.org>

It looks like the latest suggestion from the GitHub thread tracking
this issue (https://github.com/WebAudio/web-audio-api/issues/12)
would cover your use case, right?
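
For reference, here is a sketch of how a mapping along the lines of that
GitHub suggestion could be used. It assumes an API that returns a paired
{contextTime, performanceTime} snapshot (as in the getOutputTimestamp()
proposal discussed in that thread); the helper function and the fixed
numbers below are hypothetical stand-ins for a live AudioContext:

```javascript
// Given a paired snapshot of the audio clock and the performance clock,
// map an audio event scheduled at context time `t` (seconds) to a
// performance.now()-based time (milliseconds) for MIDI/visual scheduling.
function contextTimeToPerformanceTime(timestamp, t) {
  // contextTime is in seconds; performanceTime is in milliseconds.
  return timestamp.performanceTime + (t - timestamp.contextTime) * 1000;
}

// Stand-in values in place of a live AudioContext.getOutputTimestamp():
const ts = { contextTime: 1.5, performanceTime: 3000 };
contextTimeToPerformanceTime(ts, 1.75); // → 3250 (0.25 s later on the page clock)
```

In a real page one would refresh the snapshot periodically, since the two
clocks can drift relative to each other.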


On Sat, Sep 12, 2015 at 11:15 PM, Kumar <srikumarks@gmail.com> wrote:
> Hi all,
> This is to recap an earlier conversation on synchronizing audio precisely
> with MIDI events and visuals when we're scheduling these a little into the
> future.
> http://lists.w3.org/Archives/Public/public-audio/2013AprJun/0456.html
> Am I right in noticing that we don't yet have a solution that reliably maps
> between AudioContext.currentTime and the DOMHighResTimeStamp value obtained
> through performance.now() or requestAnimationFrame? (This, of course, doesn't
> apply to OfflineAudioContext.)
> Given that the spec says currentTime "increases in realtime", the inability
> to connect it with a DOMHighResTimeStamp makes precision scheduling of MIDI
> and visuals hard or impossible to do reliably.
> This issue is only somewhat related to getting latency info. It would be
> possible to construct an API for this that includes the latency info (for
> example, context.currentRealTime - performance.now()) or have the latency
> provided separately. Either way, a mapping is indispensable, I think.
> Best,
> -Kumar
Received on Sunday, 13 September 2015 14:19:13 UTC

This archive was generated by hypermail 2.3.1 : Sunday, 13 September 2015 14:19:14 UTC