W3C home > Mailing lists > Public > public-audio@w3.org > July to September 2013

Re: [web-audio-api] Need a way to determine AudioContext time of currently audible signal (#12)

From: Olivier Thereaux <notifications@github.com>
Date: Wed, 11 Sep 2013 07:29:30 -0700
To: WebAudio/web-audio-api <web-audio-api@noreply.github.com>
Message-ID: <WebAudio/web-audio-api/issues/12/24244179@github.com>
> [Original comment](https://www.w3.org/Bugs/Public/show_bug.cgi?id=20698#20) by Chris Wilson on W3C Bugzilla. Tue, 16 Apr 2013 16:23:11 GMT

Gah. That's not what the current title asks for. Average latency is a fine thing to want to know, but it doesn't address the precise synchronization need I mentioned in the email at the top of this thread, which would let authors synchronize MIDI with audio, or on-screen events with audio. I think the errors would quite possibly be audible for MIDI (you can hear a <16ms error, even if you can't see it), depending on how frequently JS code can be called with the same currentTime (related to block size?). And given what Ehsan said about their processing mechanism, I'm not sure you wouldn't also see visual sync errors with only average latency, if the block processing is >16.7ms on a slow system.

I'd suggest a title of "Need to expose average latency of system", and then I'll go file the "Need to expose time stamp of currentTime" issue that is necessary for synchronization with MIDI. I'd actually rather have this bug represent that issue, given the long background thread, but I can link them.
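[For context: the correlation being asked for here is a pairing of the audio clock with the page's performance clock. The Web Audio API later gained AudioContext.getOutputTimestamp(), which returns such a { contextTime, performanceTime } pair. A minimal sketch, assuming a pair of that shape; the helper name below is a hypothetical illustration, not a spec API:]

```javascript
// Sketch: convert an AudioContext time to the performance.now() timeline,
// given a { contextTime, performanceTime } snapshot pairing the two clocks
// (the shape that AudioContext.getOutputTimestamp() later standardized).
// `audioTimeToPerformanceTime` is a hypothetical helper, not a spec API.
function audioTimeToPerformanceTime(audioTime, timestamp) {
  // contextTime is in seconds; performanceTime is in milliseconds.
  const deltaSeconds = audioTime - timestamp.contextTime;
  return timestamp.performanceTime + deltaSeconds * 1000;
}

// Example: if audio clock 1.000 s corresponds to performance clock 2500 ms,
// an event scheduled at audio time 1.250 s lands at performance time 2750 ms,
// which can then be lined up against MIDI or on-screen event timestamps.
const snapshot = { contextTime: 1.0, performanceTime: 2500 };
console.log(audioTimeToPerformanceTime(1.25, snapshot)); // 2750
```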

Received on Wednesday, 11 September 2013 14:33:01 UTC
