Re: a sample to use Web Worker with current JavaScriptAudioNode

Hi James, thanks for the example.  It's true that we can use web workers
today in the style of your example.  But to get better performance, the
callback function ("audioProcessor" in your case) will need to be invoked
*directly* in the web worker thread.  The problem with the indirect approach
is that it adds one more thread "hop", from the main thread to the worker
thread, so latency goes up and glitching becomes more likely.  We should be
able to arrange for the audio thread to call a function in the worker thread
*directly*, but the technical details of how to do this are beyond my
understanding.  I will try to find out who is currently working on this code
in WebKit so we can get their assistance...
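
To make this concrete, here is a minimal sketch of the kind of worker-side
callback I have in mind (the shape and the names are purely illustrative;
no such API exists today):

    // hypothetical worker script: the audio engine would hand each block
    // straight to this callback, with no hop through the main thread
    onaudioprocess = function (event) {
        var input = event.inputBuffer.getChannelData(0);
        var output = event.outputBuffer.getChannelData(0);
        for (var i = 0; i < input.length; i++) {
            output[i] = input[i];    // simple pass-through processing
        }
    };

The key point is that no postMessage round-trip through the main thread would
be involved; the buffers would be delivered to the worker by the audio engine
itself.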

Chris

On Tue, Apr 17, 2012 at 5:19 AM, Wei, James <james.wei@intel.com> wrote:

>  This is a simple sample that uses a web worker with JavaScriptAudioNode.
> It just silences the left channel and doubles the right channel.
>
> http://web-audio.appspot.com/
>
> Although it can work, it has some issues, including problems with the logic:
>
> 1. We have to cache the event so that in audioResponse we can access
> event.outputBuffer.  The cached event can be overwritten by the events that
> follow.
>
> 2. audioProcessor may return before the worker has finished its work, so
> there is a potential asynchronous vs. synchronous problem (a possible
> workaround is sketched below).
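>
> One possible mitigation (not implemented in this sample; the queue and the
> names below are only illustrative) is to let the worker's replies accumulate
> in a queue and copy each one into a later onaudioprocess callback. Roughly:
>
> var pending = [];                  // blocks already processed by the worker
>
> worker.onmessage = function (e) {  // instead of audioResponse below
>     pending.push(e.data);
> };
>
> function audioProcessor(e) {
>     var li = e.inputBuffer.getChannelData(0);
>     var ri = e.inputBuffer.getChannelData(1);
>     worker.postMessage({left:li, right:ri}, [li.buffer, ri.buffer]);
>
>     var lo = e.outputBuffer.getChannelData(0);
>     var ro = e.outputBuffer.getChannelData(1);
>     if (pending.length > 0) {      // write the oldest finished block
>         var block = pending.shift();
>         lo.set(block.left);
>         ro.set(block.right);
>     }                              // otherwise the output stays silent
> }
>
> This keeps the output deterministic at the cost of at least one buffer
> (4096 frames) of extra latency.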
>
> In main.js:
>
> function audioProcessor(e) {
>     console.log('audioProcessor ' + eventCount++);
>     audioEvent = e;
>
>     var li = audioEvent.inputBuffer.getChannelData(0);
>     var ri = audioEvent.inputBuffer.getChannelData(1);
>
>     worker.postMessage({left:li, right:ri}, [li.buffer, ri.buffer]);
> }
>
> function audioResponse(e) {
>     console.log('audioResponse ' + responseCount++);
>     var data = e.data;
>
>     var l = data.left;
>     var r = data.right;
>
>     var lo = audioEvent.outputBuffer.getChannelData(0);
>     var ro = audioEvent.outputBuffer.getChannelData(1);
>
>     for (var i = 0; i < l.length; i++) {
>         lo[i] = l[i];
>         ro[i] = r[i];
>     }
> }
>
> function process() {
>     var source = context.createBufferSource();
>     var jsnode = context.createJavaScriptNode(4096, 2, 2);
>     jsnode.onaudioprocess = audioProcessor;
>     source.buffer = buffer;
>     source.connect(jsnode);
>     jsnode.connect(context.destination);
>     source.noteOn(0);
> }
>
> In worker.js:
>
> self.addEventListener('message', process, false);
>
> function process(e) {
>     var data = e.data;
>
>     if (data != undefined) {
>         var li = data.left;
>         var ri = data.right;
>
>         var lout = new Float32Array(li.length);
>         var rout = new Float32Array(ri.length);
>
>         for (var i = 0; i < li.length; i++) {
>             lout[i] = 0;
>             rout[i] = ri[i] * 2;
>         }
>
>         var message = {left: lout, right: rout};
>
>         self.postMessage(message, [lout.buffer, rout.buffer]);
>     }
> }
>
> Best Regards
>
> James
>

Received on Tuesday, 17 April 2012 17:15:16 UTC