JavaScriptNode interface

Matt Paradis and I have been thinking about the interface to the
JavaScript node, after noticing a difference between the current
stable Chrome version of the API and the one in the Canary
(development) build.

In the former, a JavaScriptNode could be created like so:

: node = createJavaScriptNode(1024, 0, 1)

This creates a node with a single output channel. The individual
streams in the channel could then be addressed inside the process
callback using

: e.outputBuffer.getChannelData(0)
: e.outputBuffer.getChannelData(1)

and so on. Chris R mentioned earlier that this is because the current
implementation defaults to 2 streams irrespective of the arguments
provided to createJavaScriptNode. In the Canary implementation this
has been changed, so the above code only works (getChannelData(1)
returns null otherwise) if the JavaScript node is created with

: node = createJavaScriptNode(1024, 0, 2)

That is, 2 *streams* are specified in the output.
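
To make the difference concrete, here is a minimal sketch of the
discrepancy (assuming the prefixed webkitAudioContext constructor and
the onaudioprocess callback as they exist in current Chrome builds):

: var context = new webkitAudioContext();
: var node = context.createJavaScriptNode(1024, 0, 1); // request 1 output stream
: node.onaudioprocess = function (e) {
:     // stable Chrome: the buffer has 2 streams and getChannelData(1) is valid;
:     // Canary: the buffer has 1 stream and getChannelData(1) returns null
:     var data = e.outputBuffer.getChannelData(0);
:     for (var i = 0; i < data.length; i++) {
:         data[i] = Math.random() * 2 - 1; // fill with white noise
:     }
: };
: node.connect(context.destination);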

It is unclear to us how this interface will work in the case where a
number of streams are multiplexed into a number of channels. For
example, say two stereo (2-stream) channels are to be processed inside
a JavaScript node; would the node be instantiated with 4 inputs (streams)

: node = createJavaScriptNode(1024, 4, 4)

or two inputs (channels)

: node = createJavaScriptNode(1024, 2, 4)

?
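
Under the first (flat) reading, our guess is that the streams of the
second stereo pair would be addressed by plain index inside the
callback; this is an assumption about the intended behaviour rather
than anything we have found documented:

: node.onaudioprocess = function (e) {
:     // assumed flat indexing: streams 0,1 = first pair, 2,3 = second pair
:     var left2 = e.inputBuffer.getChannelData(2);
:     var right2 = e.inputBuffer.getChannelData(3);
:     // pass the second input pair straight through to the first output pair
:     e.outputBuffer.getChannelData(0).set(left2);
:     e.outputBuffer.getChannelData(1).set(right2);
: };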

How would the individual streams within each channel be accessed
within the node? One solution would be for getChannelData to return an
array of stream objects, making it possible to iterate over each
stream in a channel. This would allow a JavaScript node to be written
that works with channels containing any number of streams. The
interface to createJavaScriptNode would then accept the number of
channels rather than the number of streams.
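
A hypothetical sketch of what that might look like (the
array-returning getChannelData is our proposal, not an existing API):

: // hypothetical interface: one channel in, one channel out
: var node = context.createJavaScriptNode(1024, 1, 1);
: node.onaudioprocess = function (e) {
:     // hypothetical: getChannelData returns an array of streams
:     var streams = e.outputBuffer.getChannelData(0);
:     for (var s = 0; s < streams.length; s++) {
:         var data = streams[s];
:         for (var i = 0; i < data.length; i++) {
:             data[i] = 0; // silence; a real node would synthesize or process here
:         }
:     }
: };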

Looking at the WebKit source [1] that deals with connecting channels
containing different numbers of streams, it looks like the intention
is to make it simple to work with audio processing graphs without
caring about the number of streams on the input and output sides. In
the case of the JavaScript node, and of some examples we are working
on that deal with synthesis rather than processing of audio sources,
this makes things quite a bit more complicated.

My personal preference would be for the convention of multiplexing
streams into channels to be dropped, and for all nodes to consider
stereo sources as having 2 "channels", 5.1 sources as having 6
"channels" and so on. You could determine the number of input and
output channels by querying the sources and destinations.
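
For instance, creating a node that matches the destination might then
look something like this (numberOfChannels on the destination is an
assumption about how such a query could be exposed):

: // assumed query interface: destinations report their channel count
: var outs = context.destination.numberOfChannels; // 2 for stereo, 6 for 5.1, ...
: var node = context.createJavaScriptNode(1024, 0, outs);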

Best regards,

Chris



[1] 
http://trac.webkit.org/browser/trunk/Source/WebCore/platform/audio/AudioBus.cpp