Re: Synchronizing audio with other events ...

It looks like the latest suggestion on the GitHub thread tracking
this issue (https://github.com/WebAudio/web-audio-api/issues/12)
would cover your use case, right?
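
For what it's worth, here is a rough sketch of how I'd expect that to be
used once something along those lines lands. The names (getOutputTimestamp
and its contextTime/performanceTime fields) are just my reading of the
thread, not a shipped API:

    var ctx = new AudioContext();

    // Hypothetical paired reading of the two clocks taken at the same
    // instant: contextTime in seconds, performanceTime in milliseconds.
    function contextTimeToPerformanceTime(contextTime) {
      var ts = ctx.getOutputTimestamp();
      return ts.performanceTime + (contextTime - ts.contextTime) * 1000;
    }

    // Schedule a short tone half a second out, and fire a visual cue at
    // the same moment on the requestAnimationFrame/performance.now() clock.
    var osc = ctx.createOscillator();
    osc.connect(ctx.destination);
    var when = ctx.currentTime + 0.5;                // audio clock, seconds
    osc.start(when);
    osc.stop(when + 0.1);

    var cueAt = contextTimeToPerformanceTime(when);  // DOM clock, ms
    requestAnimationFrame(function tick(now) {
      if (now >= cueAt) {
        document.body.style.background = '#ff0';     // the "visual" event
      } else {
        requestAnimationFrame(tick);
      }
    });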

Thanks,
-Russell

On Sat, Sep 12, 2015 at 11:15 PM, Kumar <srikumarks@gmail.com> wrote:
> Hi all,
>
> This is to recap an earlier conversation on synchronizing audio precisely
> with MIDI events and visuals when we're scheduling these a little into the
> future.
>
> http://lists.w3.org/Archives/Public/public-audio/2013AprJun/0456.html
>
> Am I right in noticing that we don't yet have a way to reliably map
> between AudioContext.currentTime and the DOMHighResTimeStamp values
> obtained from performance.now() or requestAnimationFrame? (This of
> course doesn't apply to OfflineAudioContext.)
>
> Given that the spec says currentTime "increases in realtime", the
> inability to relate it to a DOMHighResTimeStamp makes precise scheduling
> of MIDI and visuals difficult or impossible to do reliably.
>
> This issue is only somewhat related to getting latency info. An API for
> this could include the latency info (e.g. context.currentRealTime -
> performance.now()), or the latency could be provided separately. Either
> way, I think a mapping between the two clocks is indispensable.
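>
> To make that concrete, here is roughly what I have in mind, with
> context.currentRealTime as a made-up property giving the
> performance.now()-clock time at which context.currentTime will actually
> be heard (so it folds in the output latency):
>
>     var context = new AudioContext();
>     navigator.requestMIDIAccess().then(function (midi) {
>       var midiOutput = midi.outputs.values().next().value;
>       var osc = context.createOscillator();
>       osc.connect(context.destination);
>
>       var when = context.currentTime + 0.5;             // audio clock, s
>       osc.start(when);
>       osc.stop(when + 0.1);
>
>       // The same instant on the DOMHighResTimeStamp clock that Web
>       // MIDI's send() expects, in milliseconds.
>       var whenPerf = context.currentRealTime +
>                      (when - context.currentTime) * 1000;
>       midiOutput.send([0x90, 60, 100], whenPerf);       // note on
>       midiOutput.send([0x80, 60, 0], whenPerf + 100);   // note off
>     });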
>
> Best,
> -Kumar
>

Received on Sunday, 13 September 2015 14:19:13 UTC