
Synchronizing audio with other events ...

From: Kumar <srikumarks@gmail.com>
Date: Sun, 13 Sep 2015 08:45:49 +0530
Message-ID: <CA+bS6j6uYoXDCyo-9LkPjbhihowwx5Hu9FS8WBMKu9=kWqAy4Q@mail.gmail.com>
To: "public-audio@w3.org" <public-audio@w3.org>

Hi all,

This is to recap an earlier conversation on synchronizing audio precisely
with MIDI events and visuals when we're scheduling these a little into the
future.
Am I right in noticing that we don't yet have a solution that reliably maps
between AudioContext.currentTime and the DOMHighResTimeStamp values
obtained through performance.now() or requestAnimationFrame? (This, of
course, doesn't apply to the OfflineAudioContext.)

Given that the spec says currentTime "increases in realtime", the inability
to connect it with a DOMHighResTimeStamp makes precision scheduling of MIDI
and visuals hard or impossible to do reliably.

This issue is only somewhat related to getting latency info. It would be
possible to construct an API for this that includes the latency info (for
example, context.currentRealTime - performance.now()) or have the latency
be provided separately. Either way, I think a mapping is indispensable.
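To illustrate what one is forced to do today, here is a rough sketch of
estimating a fixed offset between the two clocks by sampling them together
and keeping the reading taken with the least jitter. All names here
(estimateClockOffset, the samplePair argument) are illustrative, not part
of any API; and the estimate is only approximate, since currentTime
advances in render-quantum-sized steps rather than continuously - which is
exactly why a platform-provided mapping would help.

```javascript
// Hypothetical helper: estimate the offset (in ms) between an audio clock
// and performance.now()-style time by repeated paired sampling. The pair
// with the smallest sampling gap is assumed least disturbed by scheduling.
function estimateClockOffset(samplePair, iterations = 100) {
  let best = null;
  for (let i = 0; i < iterations; i++) {
    const t0 = samplePair.perfNow();      // e.g. performance.now(), ms
    const audio = samplePair.audioNow();  // e.g. context.currentTime, s
    const t1 = samplePair.perfNow();
    const gap = t1 - t0;
    if (best === null || gap < best.gap) {
      // Attribute the audio reading to the midpoint of the two samples.
      best = { gap, offset: audio * 1000 - (t0 + t1) / 2 };
    }
  }
  // offset is such that: audioTimeInMs ~ performanceTimeInMs + offset
  return best.offset;
}

// Usage in a browser (illustrative, not runnable outside one):
// const ctx = new AudioContext();
// const offset = estimateClockOffset({
//   perfNow: () => performance.now(),
//   audioNow: () => ctx.currentTime,
// });
// const audioTimeForRafFrame = (rafTimestampMs + offset) / 1000;
```

Even with the midpoint trick, the result drifts and jitters by up to a
render quantum, so it is a workaround rather than a substitute for a real
mapping in the spec.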

Received on Sunday, 13 September 2015 03:16:45 UTC
