Re: [web-audio-api] (JSWorkers): ScriptProcessorNode processing in workers (#113)

> [Original comment](https://www.w3.org/Bugs/Public/show_bug.cgi?id=17415#92) by Srikumar Subramanian (Kumar) on W3C Bugzilla. Mon, 06 Aug 2012 15:39:43 GMT

Many points have been raised for and against running JS audio code in web workers. This is an important issue for developers (like me) and so I thought having a summary of the points raised thus far might be useful [with some extra notes in square brackets]. Hope this helps reach consensus.

Please add/correct/clarify as you see fit.

== Arguments for JS audio in workers ==

1. Audio computation will not be interrupted by main-thread activity such as GC, graphics, layout reflow, etc. (the main-thread callback pattern these points contrast with is sketched for reference after this list).

2. Possibility for tighter integration with the audio thread. Perhaps we will be able to remove the one-buffer delay currently in place? [Seems unlikely based on what ChrisR says - "cannot have blocking calls in system audio callbacks". Also, we cannot risk running JS code in high-priority system callbacks.]

3. Permits multiple audio contexts to operate independently of each other. If all JS audio runs on the main thread, then the JS nodes of all audio contexts are multiplexed onto the same thread. [Thus an offline context that uses a JS node will interfere with a realtime context if all JS audio code runs on the main thread.]

4. With a "main thread only" design for JS audio nodes, applications will not be able to control the effects on audio of the browser multiplexing multiple tabs into a single thread - i.e. audio generation/processing can be affected by what happens in other browser tabs, with no scope for developer control.
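
[Not part of the original summary: a minimal sketch of the main-thread JS node pattern the points above contrast with, assuming the current createScriptProcessor()/onaudioprocess shape of the API. Constructor prefixes are omitted and the buffer size is illustrative.]

```js
// Main-thread JS audio node: the onaudioprocess callback runs on the main
// thread, so GC, layout, and other tab activity can delay it and cause glitches.
var ctx = new AudioContext();
var node = ctx.createScriptProcessor(4096, 1, 1); // 4096-frame buffer, mono in/out
var phase = 0;

node.onaudioprocess = function (e) {
  var out = e.outputBuffer.getChannelData(0);
  for (var i = 0; i < out.length; i++) {
    out[i] = Math.sin(phase);                     // simple sine for illustration
    phase += 2 * Math.PI * 440 / ctx.sampleRate;
  }
};

node.connect(ctx.destination);
```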

== Arguments against JS audio in workers ==

1. API access from JS audio code running in workers will be severely limited. We may need to lobby for access to specific APIs (e.g. WebCL).

2. Workers are more complicated to program. Access to the DOM and to main-thread JS state will require additional message passing with the main thread.

3. Event communication latency between the main thread and workers is currently high (up to 0.5 s). The communication overhead also depends on OS characteristics, so consistent behaviour across OSes may be hard to achieve (a minimal way to measure this round trip is sketched after this list). [Perhaps because workers are intended for "background" tasks, minimizing the latency of events sent from the main thread to a worker isn't a priority for browser vendors?]

4. For security reasons, JS audio in workers cannot be executed synchronously with the high-priority audio thread. So the same delay present in the current WebKit implementation of JS audio nodes will likely still apply.

5. Some applications, such as device emulators, become hard or impossible to program if all JS audio is forced to run in workers.

6. [If JS audio code is to run exclusively in workers, then a browser cannot support the Web Audio API unless it also supports Web Workers. Mobile browsers may choose not to support Web Workers but will likely want to support Web Audio.]
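
[Not from the original thread: a minimal sketch of how the main-thread/worker round trip mentioned in point 3 can be measured. The worker file name "echo-worker.js" is illustrative.]

```js
// main.js -- measure the main-thread <-> worker message round trip.
var worker = new Worker('echo-worker.js'); // illustrative file name
var t0;

worker.onmessage = function () {
  console.log('round trip: ' + (Date.now() - t0) + ' ms');
};

t0 = Date.now();
worker.postMessage('ping');
```

```js
// echo-worker.js -- reply immediately to each message.
onmessage = function (e) {
  postMessage(e.data);
};
```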

== Options ==

1. Permit JS audio nodes that run in the main thread *and* have a separate node type that runs code in workers. A suggested con is that developers will tend to take the easier route, which in this case (JS on the main thread) is less reliable. OTOH, "give them some rope and some will hang themselves and others will build bridges" (Jussi).

2. It may be possible and acceptable to use AudioBufferSourceNodes to implement alternative schemes for "pumping" audio to the output, albeit at higher latencies (a rough sketch follows this list).

3. [Live with JS on the main thread for v1 of the spec and postpone worker nodes to later versions after prototyping. JS nodes on the main thread are quite usable with long enough buffers.]
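
[Not part of the original comment: a rough sketch of the "pumping" approach in option 2, which schedules successive AudioBufferSourceNodes back to back. BUFFER_FRAMES, the headroom values, and the application-supplied fillBuffer() are illustrative, and underruns are not handled.]

```js
// "Pump" computed audio to the output by scheduling AudioBufferSourceNodes
// back to back at sample-accurate start times.
var ctx = new AudioContext();
var BUFFER_FRAMES = 8192;
var nextTime = ctx.currentTime + 0.1; // small scheduling headroom

function pump() {
  var buffer = ctx.createBuffer(1, BUFFER_FRAMES, ctx.sampleRate);
  fillBuffer(buffer.getChannelData(0)); // application-supplied DSP (illustrative)

  var src = ctx.createBufferSource();
  src.buffer = buffer;
  src.connect(ctx.destination);
  src.start(nextTime);

  nextTime += BUFFER_FRAMES / ctx.sampleRate;

  // Schedule the next chunk a little before the current one ends.
  var lead = (nextTime - ctx.currentTime - 0.05) * 1000;
  setTimeout(pump, Math.max(0, lead));
}

pump();
```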

---
Reply to this email directly or view it on GitHub:
https://github.com/WebAudio/web-audio-api/issues/113#issuecomment-24244819

Received on Wednesday, 11 September 2013 14:30:58 UTC