Re: Node Parameter Control using Curves

On Mon, Oct 4, 2010 at 10:59 AM, Alistair MacDonald <al@bocoup.com> wrote:

> Good afternoon group,
>
> One thing we discussed in the Telecon today was the automation/control of
> Audio Node parameters from JavaScript using curves, where the processing is
> being performed by the compiled C code.
>
> It seems like this is something not yet covered in the WebKit Audio API and
> Joe Berkovitz brought up some interesting points from his experience working
> on the StandingWave audio engine for Flash.
>
> I just wanted to bring the subject up in the mailing list and perhaps spark
> some discussion on what form this feature might take in an API.
>
> E.g.:
>
> What syntax could be used to define and apply curves from JavaScript?
>

Without going into the details of how to actually specify the curve (linear,
log, quad, etc.) I'd like to suggest a very high-level view of how the API
could look.  I know that some of what I'm describing is probably obvious,
but it would be good to try to agree on the basic idea before going into too
many details:

For the sake of argument, let's consider an object of type "AudioCurve".  It
would represent a mapping from time to floating-point value.  The curve
would start at time zero and would have a specific duration (in seconds).  A
common use for an AudioCurve would be to attach it to a particular parameter
of an AudioNode.  It would be important to specify the exact time at which
the parameter should start taking its values from the curve.  Here's a small
example that performs a volume fade-in on an audio source of some talking
voices:


var context = new AudioContext();

var source = context.createBufferSource();
source.buffer = ambientTalkingVoices; // an AudioBuffer assumed to be loaded elsewhere

// Create a gain node to control volume
var gainNode = context.createGainNode();

// Connect the graph
source.connect(gainNode);
gainNode.connect(context.destination);

// Create a new curve (maybe this should be context.createAudioCurve() instead?)
var fadeInCurve = new AudioCurve();
fadeInCurve.duration = 10.0; // make it ten seconds long

// Set the fadeInCurve's values to something meaningful for a fade-in curve
// (code left out since the details of the API are not yet specified).
// ...

// Play the ambient talking voices one second from now, with a ten-second fade-in
var now = context.currentTime;
gainNode.gain.scheduleAutomation(fadeInCurve, now + 1.0);
source.noteOn(now + 1.0);
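
For what it's worth, here's one possible shape for the value-setting part
that was left out above.  The setValue() method is an invented name, purely
to make the sketch concrete:

// Fill fadeInCurve with a linear ramp from 0.0 to 1.0, using a
// hypothetical setValue(time, value) method:
for (var i = 0; i <= 100; i++) {
    var t = (i / 100) * fadeInCurve.duration; // time offset into the curve
    fadeInCurve.setValue(t, i / 100);
}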


With similar code, it would be possible to schedule a filter sweep to happen
at a particular time, as sketched below.  These curves could be edited in
real time with an appropriate graphical UI (canvas, SVG, WebGL, ...).
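
For example, a sweep might look like this (assuming some kind of low-pass
filter node whose cutoff frequency is exposed as an AudioParam; the
createLowPass2Filter() and cutoff names below are placeholders):

// A fresh graph: source -> filter -> destination
var filter = context.createLowPass2Filter(); // placeholder factory name
source.connect(filter);
filter.connect(context.destination);

var sweepCurve = new AudioCurve();
sweepCurve.duration = 4.0; // a four-second sweep
// ... fill sweepCurve with cutoff values rising from, say, 200 Hz to 8000 Hz ...

var now = context.currentTime;
filter.cutoff.scheduleAutomation(sweepCurve, now + 1.0);
source.noteOn(now + 1.0);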

In this example, I've defined a "scheduleAutomation()" method on AudioParam.
If somebody called this method more than once with overlapping time
segments, then I would expect the last call to supersede any previous curves
for the overlapping portion.
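
To make that concrete, suppose curveA is ten seconds long and curveB is
three seconds long:

gainNode.gain.scheduleAutomation(curveA, now);       // covers [now, now + 10]
gainNode.gain.scheduleAutomation(curveB, now + 5.0); // covers [now + 5, now + 8]

// The second call would supersede the first over [now + 5, now + 8];
// outside that range, curveA would still apply.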

I'm imagining that AudioCurve would be used for generic curves such as
fade-ins, filter and pitch sweeps, etc.  These would be the types of curves
that people typically edit in a DAW application in the timeline view.  Maybe
AudioCurve could also be used for envelopes and LFOs, or maybe it would make
more sense to have a separate AudioEnvelope type, etc.
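
If we did go the separate-type route, an ADSR-style AudioEnvelope might look
something like the following sketch (every name here is speculative):

var env = new AudioEnvelope();
env.attackTime = 0.01;  // seconds
env.decayTime = 0.2;    // seconds
env.sustainLevel = 0.7; // gain level in the range 0.0 to 1.0
env.releaseTime = 0.5;  // seconds, presumably triggered by noteOff()

gainNode.gain.scheduleAutomation(env, now); // or a dedicated attach method?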

Any thoughts?


>
> What kinds of curves are needed? (Parametric, linear, quad, etc.)
>
> Can the user supply their own curve algorithm in the same way users can add
> easing modes to graphics animations?
>
> Can curve values/types be changed in real time, etc.?
>
> Can curves be quantized to a BPM clock?
>
> Can curves drive frequency parameters for synthesis?
>
>
>
> Al MacDonald
>
>

Received on Monday, 4 October 2010 19:09:23 UTC