Re: Web Audio thoughts

From: Chris Rogers <crogers@google.com>
Date: Tue, 12 Jun 2012 10:49:58 -0700
Message-ID: <CA+EzO0kr2z9x52_7ApeizcPtu2FAkpwTeCY3Debos7nN3j8K7A@mail.gmail.com>
To: Ray Bellis <ray@bellis.me.uk>
Cc: public-audio@w3.org
On Tue, Jun 12, 2012 at 9:21 AM, Ray Bellis <ray@bellis.me.uk> wrote:

> Hi Chris & co
> I've been looking at your work on the Web Audio API and been
> experimenting a little with it.
> My current goal is to produce a virtual modular synthesizer using Web
> Audio.  To that end, I'm building a framework where I can create a
> visual graph of AudioNodes and connect them together.
> The current (but *very* early) work in progress is at
> <http://alnitak73.dyndns.org/> and requires Chrome Canary because the
> stable Chrome version doesn't include an Oscillator.
> It's only 200 lines of code at the moment, but with that I'm able to
> wrap an arbitrary AudioNode into a visual representation showing its
> inputs, outputs, and any AudioParam properties it has.  Outputs of nodes
> can be clicked on and then connected to inputs and AudioParams of other
> nodes.

Hi Ray, that sounds very cool!

> To make a full synthesizer I need to build additional pseudo-nodes to
> represent things like envelope generators and other control signals -
> equivalents to CV and gate, as it were.
> I will also need to find ways to describe the controls to existing nodes
> that aren't managed by AudioParams, such as the impulse response for a
> ConvolverNode, or the filter mode for a BiquadFilterNode.
> In any event, I think I can see some things that would be useful, but
> that don't appear to be available in the current API:
> 1.  AudioParam variables
> There's no way I can see to have an AudioParam property in a
> JavascriptAudioNode and then have that node sample the parameter.  The
> JS node is very useful, but without access to AudioParam features it's
> kind of a second class citizen.

Yes, Jussi has brought this up before, and we agreed it could be useful.
For now at least we're not adding it, but we may in the future...
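In the meantime, one workaround is to emulate a parameter with a plain JS property that the processing callback reads once per block. It's only k-rate (sampled per buffer, not per sample) and has none of AudioParam's scheduling, but it gets a tweakable control onto a JS node. This is just a sketch; the names here are mine, and `createJavaScriptNode` is the method name at the time of this thread:

```javascript
// Emulating a parameter on a JavaScriptAudioNode with a plain property.
// applyGain is a pure helper: copy input to output, scaled by gain.
function applyGain(input, output, gain) {
  for (var i = 0; i < input.length; i++) {
    output[i] = input[i] * gain;
  }
}

if (typeof AudioContext !== 'undefined') {
  var ctx = new AudioContext();
  var node = ctx.createJavaScriptNode(1024, 1, 1);
  node.gain = 0.5;  // plain property standing in for an AudioParam
  node.onaudioprocess = function (e) {
    // The property is sampled once per block, not per sample (k-rate).
    applyGain(e.inputBuffer.getChannelData(0),
              e.outputBuffer.getChannelData(0),
              node.gain);
  };
}
```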

> For example I might want to implement my ADSR EG using a JS AudioNode.
> In "real" synthesizers it's common to have the ability to tweak the ADSR
> envelope based on input frequency.  I don't think I can do that with the
> current API.

In general, AudioParams can handle this type of thing.  If I can find the
time, I'd like to create some more examples using envelopes to show some of
these things.
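As a rough illustration of what I mean, here's a minimal attack/decay/sustain sketch scheduled on a GainNode's gain parameter with `setValueAtTime` and `linearRampToValueAtTime`. The `computeADSR`/`applyEnvelope` helpers and the breakpoint shape are my own choices, not part of the API:

```javascript
// Return [time, value] breakpoints for the attack/decay portion of
// an ADSR envelope starting at time t0.
function computeADSR(t0, attack, decay, sustain) {
  return [
    [t0, 0],
    [t0 + attack, 1],                 // peak at end of attack
    [t0 + attack + decay, sustain]    // settle to sustain level
  ];
}

// Schedule the breakpoints onto an AudioParam (e.g. gainNode.gain).
function applyEnvelope(gainParam, t0, attack, decay, sustain) {
  var points = computeADSR(t0, attack, decay, sustain);
  gainParam.setValueAtTime(points[0][1], points[0][0]);
  for (var i = 1; i < points.length; i++) {
    gainParam.linearRampToValueAtTime(points[i][1], points[i][0]);
  }
}

if (typeof AudioContext !== 'undefined') {
  var ctx = new AudioContext();
  var gain = ctx.createGainNode();  // the method name at the time
  applyEnvelope(gain.gain, ctx.currentTime, 0.01, 0.2, 0.6);
}
```

The release stage would be scheduled the same way at note-off time, ramping from the sustain level back to zero.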

By the way, one thing which is now possible is to feed the output from a
JavaScriptAudioNode into a parameter, thus controlling the parameter with
an audio-rate signal generated in JS.  This is kind of the opposite of what
you're describing, but is useful.
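For example, a JS node can act as an LFO driving an oscillator's frequency; connecting a node's output to an AudioParam sums the signal into the parameter's value. The `fillRamp` helper and the rate constants below are illustrative assumptions:

```javascript
// Fill buf with a rising ramp in [0, 1); inc is the per-sample
// increment (ramp frequency / sample rate). Returns the new phase.
function fillRamp(buf, phase, inc) {
  for (var i = 0; i < buf.length; i++) {
    buf[i] = phase;
    phase += inc;
    if (phase >= 1) phase -= 1;
  }
  return phase;
}

if (typeof AudioContext !== 'undefined') {
  var ctx = new AudioContext();
  var osc = ctx.createOscillator();
  var lfo = ctx.createJavaScriptNode(1024, 1, 1);  // input unused
  var phase = 0;
  lfo.onaudioprocess = function (e) {
    // ~2 Hz ramp; in practice you'd scale it up, since these values
    // are added directly to the frequency parameter in Hz.
    phase = fillRamp(e.outputBuffer.getChannelData(0), phase,
                     2 / ctx.sampleRate);
  };
  lfo.connect(osc.frequency);  // JS-generated signal drives the param
}
```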

> I can't even implement a custom JS oscillator that has the same external
> interface as the standard Oscillator without access to AudioParams.

> 2.  More fine-grained "disconnect"
> With the current API and my framework I can connect specific outputs to
> specific inputs, but the API only allows for disconnecting _every_ link
> from a specific output.  This feels like a significant limitation.

Yes, we've talked a little about this before.  It's something we should
consider adding in a future version.
> 3.  Interrogation of the node graph
> It would be really handy to be able to ask the AudioContext (or
> individual nodes) for their list of connections.
> I'm not sure how well that sits with "one shot" generator nodes seeing
> as they automatically disconnect themselves, though...

I think we've discussed this before.  Although we could add such an API,
it's not that hard for JS wrapper code to keep track of these connections.
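The bookkeeping might look like the sketch below: wrap connect/disconnect so the wrapper remembers each edge, which both answers graph queries (your point 3) and emulates per-connection disconnect (your point 2) by severing everything from a node and reconnecting the survivors. The `Graph` wrapper is hypothetical, not part of the API:

```javascript
// Track AudioNode connections in JS so the graph can be queried
// and individual links removed.
function Graph() { this.edges = []; }

Graph.prototype.connect = function (src, dst) {
  src.connect(dst);
  this.edges.push([src, dst]);
};

Graph.prototype.disconnect = function (src, dst) {
  // The API can only sever every link from a node, so drop them all
  // and rebuild the remaining connections afterwards.
  src.disconnect();
  this.edges = this.edges.filter(function (e) {
    return !(e[0] === src && e[1] === dst);
  });
  this.edges.forEach(function (e) {
    if (e[0] === src) src.connect(e[1]);
  });
};

Graph.prototype.connectionsFrom = function (src) {
  return this.edges.filter(function (e) { return e[0] === src; })
                   .map(function (e) { return e[1]; });
};
```

One caveat, as you note: one-shot source nodes that disconnect themselves would leave stale edges in this table, so a real wrapper would need to prune those.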

> I'd appreciate your thoughts on these, and apologise if they've already
> been covered or otherwise addressed in the mailing lists.
> cheers,
> Ray Bellis
Received on Tuesday, 12 June 2012 17:50:28 UTC