
Re: [web-audio-api] Need a way to determine AudioContext time of currently audible signal (#12)

From: Olivier Thereaux <notifications@github.com>
Date: Wed, 11 Sep 2013 07:29:22 -0700
To: WebAudio/web-audio-api <web-audio-api@noreply.github.com>
Message-ID: <WebAudio/web-audio-api/issues/12/24244079@github.com>
> [Original comment](https://www.w3.org/Bugs/Public/show_bug.cgi?id=20698#2) by Joe Berkovitz / NF on W3C Bugzilla. Tue, 02 Apr 2013 16:10:16 GMT

This bug is intended to cover both the MIDI-synchronization cases you proposed and the original case I raised, which involves placing a visual cursor synchronized with the audio being heard at the same time.

In the original visual case, the main thread needs to be able to determine the original "audio scheduling" time (in the context time frame used by start(), setValueAtTime(), etc.) for the audio signal presently emerging from the speaker. AudioContext.currentTime does not supply this time, as I explained in my original bug description.
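To make the request concrete, here is a minimal sketch of the mapping being asked for. It assumes some `outputLatency` value is available; no such property existed in the API at the time of this thread (the later spec added `AudioContext.outputLatency` and `baseLatency`), so treat the name and the constant below as illustrative assumptions, not part of the proposal itself.

```javascript
// Hypothetical sketch: recover the context-time position of the signal
// presently emerging from the speaker. `currentTime` is the value of
// AudioContext.currentTime; `outputLatency` is an assumed estimate (in
// seconds) of the delay between scheduling and audible output.
function audibleContextTime(currentTime, outputLatency) {
  // The sample being heard right now was scheduled this long ago,
  // so subtract the latency from the context clock.
  return currentTime - outputLatency;
}

// Usage for cursor placement: a note scheduled with start(t) is audible
// once audibleContextTime(ctx.currentTime, latency) reaches t.
```

The point of the bug is that the main thread has no reliable way to obtain the `outputLatency` term; `AudioContext.currentTime` alone reflects scheduling time, not audible time.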

I am not interested in the average latency or processing block size and agree that would be a different bug.

---
Reply to this email directly or view it on GitHub:
https://github.com/WebAudio/web-audio-api/issues/12#issuecomment-24244079
Received on Wednesday, 11 September 2013 14:29:46 UTC
