Web Audio API and Web Workers

From: Jussi Kalliokoski <jussi.kalliokoski@gmail.com>
Date: Thu, 17 Nov 2011 01:15:46 +0200
Message-ID: <CAJhzemXGbMnGwOr88CyuQK91-XG8SUwzn+-13SUm9Kd8PvewEQ@mail.gmail.com>
To: public-audio@w3.org

Hello folks,

Just thought I'd address a hot topic on my mind right now: the performance of
full or partial JS audio in the Web Audio API. Correct me if I'm wrong, but
currently, if you add a JavaScriptProcessingNode to a graph, you basically
expose the whole graph to the performance problems listed in the
specification [1], the very problems the high-level API is trying to shield
you from. In my experience, if you're careful with garbage collection (and JS
engines keep getting faster), one problem stands out: the audio processing JS
runs in the same thread as UI operations, blocking XHRs and whatnot, and is
thus very unstable. This is something to avoid in any kind of audio
processing, so I've come up with two alternative, possibly coexisting,
solutions to that problem:
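
To make the problem concrete, here's a minimal sketch of the pattern I mean.
The node and event names follow the draft, but the exact shapes are from
memory, so treat this as illustrative rather than spec-accurate:

```javascript
// The sample-filling logic is plain JS, so it runs wherever the callback
// does, which today means the UI thread, alongside layout, GC, and XHRs.
function fillSine(output, sampleRate, startFrame, frequency) {
  // Fill one block of output samples with a sine wave, tracking the frame
  // counter across blocks so the phase stays continuous.
  for (var i = 0; i < output.length; i++) {
    output[i] = Math.sin(2 * Math.PI * frequency * (startFrame + i) / sampleRate);
  }
  return startFrame + output.length;
}

// Wired up in a page, roughly (hypothetical names):
// var node = context.createJavaScriptProcessingNode(1024, 0, 1);
// var frame = 0;
// node.onaudioprocess = function (e) {
//   frame = fillSine(e.outputBuffer.getChannelData(0),
//                    context.sampleRate, frame, 440);
// };
```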

1) Instead of a callback, you could pass a Web Worker to the
createJavaScriptProcessingNode function (similarly to Robert's
MediaStreamProcessing API). This worker would have an event called
"onaudioprocess", and the event would fire with the same arguments as the
JavaScriptProcessingNode's "onaudioprocess" event.
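
The worker script under proposal 1 might look something like the following.
The global "onaudioprocess" handler and the event's buffer accessors are my
assumptions about how the API could be shaped, mirroring the node's event;
nothing here is spec'd. The DSP itself is plain JS, here just a gain applied
from the input buffer to the output buffer:

```javascript
// Plain-JS DSP kernel: copy input to output with a gain. This is the part
// that actually benefits from running off the UI thread.
function applyGain(input, output, gain) {
  for (var i = 0; i < input.length; i++) {
    output[i] = input[i] * gain;
  }
}

// In the worker (illustrative; this handler name is the proposal, not an
// existing API):
// onaudioprocess = function (e) {
//   applyGain(e.inputBuffer.getChannelData(0),
//             e.outputBuffer.getChannelData(0),
//             0.5);
// };
```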

2) Expose the AudioContext API to workers as well.

How do these propositions sound?

In addition, I've been thinking about how these two APIs could coexist rather
than compete (because I'm rather tempted by the use cases they address
together). Say proposition #2 happened: tada, you could use the AudioContext
functionality in the workers that handle the audio in the
MediaStreamProcessing API. A more advanced (and, again, possibly coexisting)
alternative would be to let you pass an AudioContext object to the
MediaStreamProcessing API instead of a Worker. That would probably mean new
nodes for AudioContext, something like "createMediaStreamProcessingSourceNode"
and "createMediaProcessingSinkNode" (ugh), that would know how to connect
themselves to the inputs and outputs of the MediaStreamProcessor.
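
For what it's worth, the wiring I have in mind for that advanced alternative
is roughly the sketch below. None of these context methods exist; the creator
names are only the speculative ones from the previous paragraph, and the
bridge returns both endpoints so effects could be spliced in between:

```javascript
// Bridge a MediaStreamProcessor through an AudioContext graph (speculative).
function bridge(context, processor) {
  // Node that would pull the processor's output stream into the graph.
  var source = context.createMediaStreamProcessingSourceNode(processor);
  // Node that would feed the graph's signal back into the processor's input.
  var sink = context.createMediaProcessingSinkNode(processor);
  // Identity graph by default; callers can disconnect and insert effects.
  source.connect(sink);
  return { source: source, sink: sink };
}
```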

Cheers,
Jussi

[1]
http://www.w3.org/2011/audio/drafts/1WD/WebAudio/#JavaScriptProcessing-section
Received on Wednesday, 16 November 2011 23:16:14 GMT