
Re: Number of channels in the JavaScriptAudioNode

From: Raymond Toy <rtoy@google.com>
Date: Tue, 3 Apr 2012 10:47:51 -0700
Message-ID: <CAE3TgXEcpgjZV5OHMGSTJLkW8tV8dmyLfm=yto=MuFRCxhVTcQ@mail.gmail.com>
To: Chris Lowis <chris.lowis@bbc.co.uk>
Cc: public-audio@w3.org
On Tue, Apr 3, 2012 at 8:31 AM, Chris Lowis <chris.lowis@bbc.co.uk> wrote:

> Hi,
> In the process of working up some simple synthesis examples, I've come
> across some behaviour I don't understand. Perhaps Chris, Raymond or someone
> familiar with the Web Audio API can help.
> I'm trying to create a JavaScriptAudioNode that adds two mono signals
> together (generated by my javascript node-based sine wave generator).
> When I execute:
>  context = new webkitAudioContext
>  node = context.createJavaScriptNode(1024, 2, 1)
> I see that 'node' has
>  numberOfInputs: 1
>  numberOfOutputs: 1
> (at least in Chrome 18.0.1025.142)
> Is it possible at the moment to create javascript nodes with multiple
> inputs and outputs?

I don't think that's possible right now; I believe it's a feature to be
added in the future.

> In general what's a good way of working with mono signals? I admit to
> being a little unsure of the best way to use the various splitter and
> merger nodes.

I have not tried this, but it seems you can hook up your two mono signals
to an AudioChannelMerger.  Perhaps something like

merger = context.createChannelMerger();

The output of the merger will be a stereo signal that your javascript node
can access and process.
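Spelled out a bit more, the wiring might look like the sketch below.  I
haven't run this; the function name and source arguments are illustrative,
but `connect(destination, output, input)` is the standard way to route a
source into a specific input of the merger.

```javascript
// Hypothetical sketch: merge two mono sources into one stereo stream
// that a JavaScriptAudioNode can then process.
function mergeMonoSources(context, leftSource, rightSource) {
  // The merger combines several mono inputs into one multi-channel output.
  var merger = context.createChannelMerger();

  // Route each mono source to its own input of the merger:
  // input 0 becomes the left channel, input 1 the right.
  leftSource.connect(merger, 0, 0);
  rightSource.connect(merger, 0, 1);
  return merger;
}

// Usage, in a browser with the prefixed 2012-era API:
// var context = new webkitAudioContext();
// var node = context.createJavaScriptNode(1024, 2, 1);
// mergeMonoSources(context, sineA, sineB).connect(node);
```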

Does that work?  I haven't tried any of this out myself, so I can't say
for sure.
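For the summing itself, the javascript node's processing callback might
then look something like this.  Again untested; `sumToMono` is an
illustrative name, and this assumes a node created with
`createJavaScriptNode(1024, 2, 1)` so the input buffer has two channels
and the output has one.

```javascript
// Hypothetical sketch of the onaudioprocess callback: read the two
// channels delivered by the merger and write their sum to a mono output.
function sumToMono(event) {
  var left = event.inputBuffer.getChannelData(0);
  var right = event.inputBuffer.getChannelData(1);
  var out = event.outputBuffer.getChannelData(0);
  for (var i = 0; i < out.length; i++) {
    out[i] = left[i] + right[i];
  }
}
// node.onaudioprocess = sumToMono;
```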

Received on Tuesday, 3 April 2012 17:48:26 UTC
