- From: Ray Bellis <ray@bellis.me.uk>
- Date: Tue, 12 Jun 2012 17:21:52 +0100
- To: public-audio@w3.org
Hi Chris & co,

I've been looking at your work on the Web Audio API and have been experimenting a little with it. My current goal is to produce a virtual modular synthesizer using Web Audio.

To that end, I'm building a framework where I can create a visual graph of AudioNodes and connect them together. The current (but *very* early) work in progress is at <http://alnitak73.dyndns.org/> and requires Chrome Canary, because the stable Chrome version doesn't include an Oscillator.

It's only 200 lines of code at the moment, but with that I'm able to wrap an arbitrary AudioNode in a visual representation showing its inputs, outputs, and any AudioParam properties it has. Outputs of nodes can be clicked on and then connected to the inputs and AudioParams of other nodes.

To make a full synthesizer I need to build additional pseudo-nodes to represent things like envelope generators and other control signals - equivalents to CV and gate, as it were. I will also need to find ways to describe the controls on existing nodes that aren't managed by AudioParams, such as the impulse response for a ConvolverNode, or the filter mode for a BiquadFilterNode.

In any event, I think I can see some things that would be useful but that don't appear to be available in the current API:

1. AudioParam variables

There's no way I can see to have an AudioParam property on a JavaScriptAudioNode and then have that node sample the parameter. The JS node is very useful, but without access to AudioParam features it's something of a second-class citizen.

For example, I might want to implement my ADSR EG using a JS AudioNode. In "real" synthesizers it's common to be able to tweak the ADSR envelope based on the input frequency; I don't think I can do that with the current API. I can't even implement a custom JS oscillator with the same external interface as the standard Oscillator without access to AudioParams.

2. More fine-grained "disconnect"

With the current API and my framework I can connect specific outputs to specific inputs, but the API only allows disconnecting _every_ link from a specific output. This feels like a significant limitation.

3. Interrogation of the node graph

It would be really handy to be able to ask the AudioContext (or the individual nodes) for their list of connections. I'm not sure how well that sits with "one-shot" generator nodes, seeing as they automatically disconnect themselves, though...

I've appended a few rough snippets after my sig to illustrate points 1-3.

I'd appreciate your thoughts on these, and apologise if they've already been covered or otherwise addressed on the mailing lists.

cheers,

Ray Bellis
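P.S. A few rough snippets to illustrate the points above. They assume the names Canary exposes at the moment - webkitAudioContext, createJavaScriptNode, onaudioprocess - and are rough, untested sketches rather than anything polished.

For point 1, this is roughly the shape of a custom oscillator built on a JavaScriptAudioNode. The "frequency" it exposes can only be a plain number; there's no way to make it an AudioParam that another node (an LFO, say) could drive, or that the node could sample as it runs:

  var ctx = new webkitAudioContext();

  // A naive sawtooth oscillator wrapped in a JavaScriptAudioNode.
  function createJSOscillator(frequency) {
    var node = ctx.createJavaScriptNode(1024, 1, 1);
    var phase = 0;
    node.frequency = frequency;           // plain number, *not* an AudioParam
    node.onaudioprocess = function (e) {
      var out = e.outputBuffer.getChannelData(0);
      var inc = node.frequency / ctx.sampleRate;
      for (var i = 0; i < out.length; i++) {
        out[i] = 2 * phase - 1;           // rising saw in [-1, 1]
        phase += inc;
        if (phase >= 1) phase -= 1;
      }
    };
    return node;
  }

  var saw = createJSOscillator(440);
  saw.connect(ctx.destination);

Because saw.frequency is just a number, something like lfo.connect(saw.frequency) isn't possible, which is exactly the AudioParam access I'm missing.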
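For point 2, the limitation is easier to show than to describe - once an output fans out to more than one destination, the only disconnect available tears down every link from that output:

  var osc = ctx.createOscillator();
  var filter = ctx.createBiquadFilter();
  var analyser = ctx.createAnalyser();

  osc.connect(filter);      // link A
  osc.connect(analyser);    // link B

  // The only form the API offers: drop *everything* on output 0.
  // There's no way to remove link B while keeping link A.
  osc.disconnect(0);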
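And point 3 amounts to asking the API to do bookkeeping like this on our behalf - at the moment anything that wants to know the shape of the graph has to shadow every connection itself (node-to-node connections only here, for brevity; the helper names are just made up):

  // Shadow table of edges, kept alongside the real graph because neither
  // the AudioContext nor the nodes can report their connections.
  var connections = [];

  function trackedConnect(source, dest, output, input) {
    source.connect(dest, output || 0, input || 0);
    connections.push({ source: source, dest: dest,
                       output: output || 0, input: input || 0 });
  }

  function trackedDisconnect(source, output) {
    source.disconnect(output || 0);
    connections = connections.filter(function (c) {
      return !(c.source === source && c.output === (output || 0));
    });
  }

Being able to ask the context (or the nodes) directly would make tables like this unnecessary.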
Received on Tuesday, 12 June 2012 16:51:16 UTC