
Re: [web-audio-api] (JSWorkers): ScriptProcessorNode processing in workers (#113)

From: Olivier Thereaux <notifications@github.com>
Date: Wed, 11 Sep 2013 07:30:25 -0700
To: WebAudio/web-audio-api <web-audio-api@noreply.github.com>
Message-ID: <WebAudio/web-audio-api/issues/113/24244844@github.com>
> [Original comment](https://www.w3.org/Bugs/Public/show_bug.cgi?id=17415#94) by Grant Galitz on W3C Bugzilla. Fri, 29 Mar 2013 04:15:52 GMT

I remember writing code for my JS audio lib to support the MediaStream API and being required to output audio in a different thread: https://github.com/grantgalitz/XAudioJS/blob/master/XAudioServerMediaStreamWorker.js
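For reference, the cross-thread handoff in that worker boils down to the usual postMessage pattern: the main thread hands sample buffers to the worker, and the worker queues them until the output side asks for more. A rough sketch of that pattern only (the message shape and buffer handling here are assumptions for illustration, not the actual XAudioJS protocol):

```js
// main thread: hand generated samples to the audio worker
// (illustrative message shape, not the real XAudioJS protocol)
var audioWorker = new Worker('audio-worker.js');

function pushSamples(float32Samples) {
  // transfer the underlying ArrayBuffer to avoid copying on every post
  audioWorker.postMessage({ type: 'samples', samples: float32Samples },
                          [float32Samples.buffer]);
}

// audio-worker.js: queue samples until the (experimental) output callback drains them
var queue = [];
onmessage = function (e) {
  if (e.data.type === 'samples') {
    queue.push(e.data.samples);
  }
};
```

Every hop like this adds queueing on top of the output buffering itself, which is where the latency piles up.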

It worked with an experimental Firefox build that was published. The problem is that there's up to a quarter second of latency, which kills it: the end user of a browser will notice a huge delay between an in-game event and the audio for it.

I have no problem with a spec that supports off-thread audio; just don't cripple things by removing on-thread audio. It's been mentioned many times that canvas support in workers is being worked on, so that graphics can be handled alongside the audio off the UI thread. The problem I have with that is it complicates keeping the audio library separate from the code that uses it. I want to keep supporting legacy audio APIs that use the main UI thread, and this will complicate that greatly.
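For comparison, the on-thread path in question is just the plain ScriptProcessorNode callback on the main thread. A minimal sketch (nextSample() is a stand-in for whatever the library pulls samples from, not real XAudioJS code):

```js
// main-thread audio generation with ScriptProcessorNode, as specced at the time
var ctx = new (window.AudioContext || window.webkitAudioContext)();
var node = ctx.createScriptProcessor(2048, 1, 1); // bufferSize, input channels, output channels

node.onaudioprocess = function (e) {
  var out = e.outputBuffer.getChannelData(0);
  for (var i = 0; i < out.length; i++) {
    out[i] = nextSample(); // pull the next sample from the game/emulator core
  }
};

node.connect(ctx.destination);
```

Removing this path means every library that generates samples on the main thread has to be restructured around a worker boundary.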

---
Reply to this email directly or view it on GitHub:
https://github.com/WebAudio/web-audio-api/issues/113#issuecomment-24244844
Received on Wednesday, 11 September 2013 14:39:32 UTC
