- From: Olivier Thereaux <notifications@github.com>
- Date: Wed, 11 Sep 2013 07:30:18 -0700
- To: WebAudio/web-audio-api <web-audio-api@noreply.github.com>
Received on Wednesday, 11 September 2013 14:39:00 UTC
> [Original comment](https://www.w3.org/Bugs/Public/show_bug.cgi?id=17415#84) by Chris Rogers on W3C Bugzilla. Mon, 30 Jul 2012 23:45:01 GMT
>
> I think there's been a misunderstanding that somehow the JavaScript code rendering audio in a JavaScriptAudioNode callback will block the audio thread! This is not the case. An implementation should use buffering (a producer/consumer model) where the JS thread produces and the audio thread consumes (with no blocking). This is how it's implemented in WebKit.
>
> Additionally, the JS callbacks should all be clocked/scheduled from the audio system (in the implementation), and should not rely on setTimeout() or require client polling/querying of a timestamp from JavaScript (which is a much less ideal approach).
>
> ---
> Reply to this email directly or view it on GitHub: https://github.com/WebAudio/web-audio-api/issues/113#issuecomment-24244774
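The producer/consumer model described above can be sketched as a single-producer/single-consumer ring buffer: the JS thread pushes rendered samples, and the audio thread pops them without ever blocking, emitting silence on underrun. This is a minimal illustrative sketch, not WebKit's actual classes; the `AudioFifo` name and its methods are hypothetical.

```cpp
#include <atomic>
#include <cstddef>
#include <vector>

// Hypothetical SPSC ring buffer of audio samples. One producer thread
// (the JS thread running the JavaScriptAudioNode callback) and one
// consumer thread (the real-time audio thread) may use it concurrently.
class AudioFifo {
public:
    explicit AudioFifo(size_t capacity)
        : buffer_(capacity), readIndex_(0), writeIndex_(0) {}

    // Producer side (JS thread): returns false when full. The producer
    // renders ahead of the consumer; it never makes the consumer wait.
    bool push(float sample) {
        size_t w = writeIndex_.load(std::memory_order_relaxed);
        size_t next = (w + 1) % buffer_.size();
        if (next == readIndex_.load(std::memory_order_acquire))
            return false; // full
        buffer_[w] = sample;
        writeIndex_.store(next, std::memory_order_release);
        return true;
    }

    // Consumer side (audio thread): never blocks. On underrun it
    // returns 0.0f (silence) instead of waiting for the producer.
    float pop() {
        size_t r = readIndex_.load(std::memory_order_relaxed);
        if (r == writeIndex_.load(std::memory_order_acquire))
            return 0.0f; // underrun: output silence, do not block
        float sample = buffer_[r];
        readIndex_.store((r + 1) % buffer_.size(), std::memory_order_release);
        return sample;
    }

private:
    std::vector<float> buffer_;
    std::atomic<size_t> readIndex_;
    std::atomic<size_t> writeIndex_;
};
```

Because each index is written by exactly one thread and read with acquire/release ordering by the other, neither side ever takes a lock, which is what keeps the audio thread glitch-safe even if the JS callback runs late.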