[Bug 17415] (JSWorkers): JavaScriptAudioNode processing in workers

https://www.w3.org/Bugs/Public/show_bug.cgi?id=17415

--- Comment #51 from Grant Galitz <grantgalitz@gmail.com> 2012-07-26 15:05:30 UTC ---
(In reply to comment #50)
> Grant, it seems to me that there are at least two options for main-thread audio
> generation even if there's no JavaScriptAudioNode.
> 
> 1. Generate your audio into AudioBuffers and schedule these to play
> back-to-back with AudioBufferSourceNodes. (I haven't tried whether the WebKit
> implementation handles this gaplessly, but I don't see why we shouldn't support
> this in the spec.)
> 
> 2. Generate your audio into AudioBuffers and postMessage these to a
> WorkerAudioNode. If ownership of the buffer is transferred it should be cheap
> and there's no reason why this should incur a large delay, particularly not
> half a second like you've seen. That sounds like a browser bug to be fixed.
> 
> In both cases one will have one new object per buffer to GC: in the first
> case it's an AudioBufferSourceNode, and in the second case it's the event
> object on the worker side.

Option 2 is not viable, due to the messaging latency between web workers and
the main thread. I tried worker-based audio generation with the
MediaStreamProcessing API (an experimental API by roc; he even had builds for
it), sending buffers from main to worker, and the latency was around a third
of a second or more.
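
For reference, this is roughly the kind of main-to-worker hand-off being
discussed. A minimal sketch only: AudioBuffer itself is not transferable, so
what actually moves is the Float32Array channel data, and the worker-side
consumer shown here is hypothetical.

    // Main thread: generate a chunk of samples and hand it to the worker.
    // Listing the underlying ArrayBuffer as a transferable avoids a copy.
    var samples = new Float32Array(4096);
    // ... fill samples with generated audio ...
    worker.postMessage({ samples: samples }, [samples.buffer]);

    // Worker side: the transferred chunk arrives intact.
    onmessage = function (e) {
      var chunk = e.data.samples;  // Float32Array backed by the moved buffer
      // ... feed chunk to whatever consumes audio in the worker ...
    };

Even with the copy avoided, each chunk still has to go through the receiving
side's event queue, which is where the latency described above shows up.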

Option 1 does not make the situation for gapless audio any better here; it
just makes it harder to push audio out. The browser knows best when to fire
audio refills, and forcing the JS code to schedule the audio itself will make
buffering and dropouts worse.
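
For comparison, option 1 amounts to scheduling each generated chunk
back-to-back on the context clock, something like the sketch below (written
against the prefixed, noteOn-era WebKit API current in 2012; queueChunk is an
illustrative name, not part of any spec):

    var ctx = new webkitAudioContext();
    var nextTime = 0;

    // Schedule a pre-filled AudioBuffer to start exactly where the
    // previous chunk ends.
    function queueChunk(buffer) {
      var src = ctx.createBufferSource();
      src.buffer = buffer;
      src.connect(ctx.destination);
      if (nextTime < ctx.currentTime) {
        nextTime = ctx.currentTime;  // fell behind; restart from "now"
      }
      src.noteOn(nextTime);          // start() in later drafts of the spec
      nextTime += buffer.duration;
    }

Here the JS code, not the browser, has to decide how far ahead of currentTime
to stay, which is exactly the scheduling burden being objected to above.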

-- 
Configure bugmail: https://www.w3.org/Bugs/Public/userprefs.cgi?tab=email
------- You are receiving this mail because: -------
You are on the CC list for the bug.
