Re: Node Parameter Control using Curves

Chris,

I think that the overall approach you outline here does make sense.
There are a bunch of detailed issues regarding types of curves, etc.,
which I'll respond to elsewhere.

One issue here is what the timing attaches to:

Is it (as you proposed):

     gainNode.gain.scheduleAutomation(fadeInCurve, now + 1.0);
     source.noteOn(now + 1.0);

or:

     gainNode.gain.scheduleAutomation(fadeInCurve);
     fadeInCurve.noteOn(now+1.0);
     source.noteOn(now + 1.0);

The latter approach feels like a more consistent treatment of audio  
and modulation sources in some ways.  And that has advantages of its  
own -- any audio node could act as a modulator/automator.  Perhaps  
this would allow the use of a JavaScriptAudioNode as the source of a  
modulation/automation signal.
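
For example, something like this might be possible (a purely
hypothetical sketch -- connecting a node's output to an AudioParam is
my assumption here, not anything currently specified):

     // Hypothetical: a JavaScriptAudioNode computes a 5 Hz sine LFO and
     // feeds it to a parameter as a modulation signal.
     var lfo = context.createJavaScriptNode(1024);
     var phase = 0;
     lfo.onaudioprocess = function(event) {
         var output = event.outputBuffer.getChannelData(0);
         for (var i = 0; i < output.length; i++) {
             output[i] = 0.5 + 0.5 * Math.sin(phase);  // gain values in [0, 1]
             phase += 2 * Math.PI * 5 / context.sampleRate;
         }
     };
     lfo.connect(gainNode.gain);  // the assumed node-to-parameter connection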

But the correct answer might be "neither of the above".  I will be
writing up some thoughts on time-sequencing issues shortly, because
I'm a little troubled by the idea of noteOn() as the only way that
things get choreographed in the API -- though at the same time, I
love noteOn().  Anyway, I'll explain elsewhere...

Also, terminology.  To be a bit fussy, I favor calling this sort of
thing "modulation" rather than "automation" because it is more
general.  Automation suggests a mixdown-style usage, like moving a
fader, whereas modulation is, more generally, any signal that affects
some parameter value in real time.  Music synthesis tends to be more
about modulation than automation -- vibrato, crescendo, guitar note
bends, etc.  These act within a single musical event (often a single
note), rather than across a stream of events.
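
To make the distinction concrete, vibrato under your proposal might
look something like this (again hypothetical -- it assumes the source
node exposes a sweepable playbackRate parameter):

     // A short periodic curve modulating pitch within a single note,
     // as opposed to an automation curve spanning many events.
     var vibratoCurve = new AudioCurve();
     vibratoCurve.duration = 2.0;  // no longer than the note itself
     // ... fill with a small sinusoidal excursion around 1.0 ...
     source.playbackRate.scheduleAutomation(vibratoCurve, now + 1.0);
     source.noteOn(now + 1.0);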


On Oct 4, 2010, at 3:08 PM, Chris Rogers wrote:

>
>
> On Mon, Oct 4, 2010 at 10:59 AM, Alistair MacDonald <al@bocoup.com>  
> wrote:
> Good afternoon group,
>
> One thing we discussed in the Telecon today was the automation/ 
> control of Audio Node parameters from JavaScript using curves, where  
> the processing is being performed by the compiled C code.
>
> It seems like this is something not yet covered in the WebKit Audio  
> API and Joe Berkovitz brought up some interesting points from his  
> experience working on the StandingWave audio engine for Flash.
>
> I just wanted to bring the subject up in the mailing list and  
> perhaps spark some discussion on what form this feature might take  
> in an API.
>
> E.g.:
>
> What syntax could be used to define and apply curves from JavaScript?
>
> Without going into the details of how to actually specify the curve  
> (linear, log, quad, etc.) I'd like to suggest a very high-level view  
> of how the API could look.  I know that some of what I'm describing  
> is probably obvious, but it would be good to try to agree on the  
> basic idea before going into too many details:
>
> For the sake of argument, let's consider an object of type  
> "AudioCurve".  It would represent a mapping from time to floating- 
> point value.  The curve would start at time zero and would have a  
> specific duration (in seconds).  A common use for an AudioCurve  
> would be to attach it to a particular parameter of an AudioNode.  It  
> would be important to specify at what exact time the parameter  
> should start interpreting its values from this curve.  Here's a  
> small example for doing a volume fade-in on an audio source of some  
> talking voices:
>
>
> var context = new AudioContext();
>
> var source = context.createBufferSource();
> source.buffer = ambientTalkingVoices;
>
> // Create a gain node to control volume
> var gainNode = context.createGainNode();
>
> // Connect the graph
> source.connect(gainNode);
> gainNode.connect(context.destination);
>
> // create a new curve
> var fadeInCurve = new AudioCurve(); // maybe this should be  
> context.createAudioCurve() instead?
> fadeInCurve.duration = 10.0; // make it ten seconds long
>
> // Set the fadeInCurve's values to something meaningful for a fade- 
> in curve (code left out since details of API are not yet specified).
> // ...
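> // Hypothetically, something like this -- setValue(time, value) is
> // invented purely for illustration and is not part of any draft:
> for (var t = 0; t <= fadeInCurve.duration; t += 0.1) {
>     fadeInCurve.setValue(t, t / fadeInCurve.duration); // linear ramp 0 -> 1
> }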
>
> // Play the ambient talking voices in one second from now, with a 10  
> second fade-in
> var now = context.currentTime;
> gainNode.gain.scheduleAutomation(fadeInCurve, now + 1.0);
> source.noteOn(now + 1.0);
>
>
> With similar code, it would be possible to schedule a filter sweep  
> to happen at a particular time.  These curves could be edited in  
> real-time with an appropriate graphical UI (canvas, SVG, WebGL...)
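>
> For instance (a hypothetical sketch -- "filter" stands for some filter
> node already in the graph, and "cutoff" is an assumed parameter name):
>
> var sweepCurve = new AudioCurve();
> sweepCurve.duration = 4.0; // a four-second sweep
> // ... fill with rising cutoff frequency values ...
> filter.cutoff.scheduleAutomation(sweepCurve, now + 1.0);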
>
> In this example, I've defined a "scheduleAutomation()" method on  
> AudioParam.  If somebody called this method more than once on the  
> same overlapping time segment, then I would expect the last call to  
> supersede any previous curves for that segment.
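>
> For example, given two 10-second curves curveA and curveB
> (illustrative names only), the second call here would replace
> curveA's effect from now + 5.0 onward:
>
> gainNode.gain.scheduleAutomation(curveA, now);
> gainNode.gain.scheduleAutomation(curveB, now + 5.0);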
>
> I'm imagining that AudioCurve would be used for generic curves such  
> as fade-ins, filter and pitch sweeps, etc.  These would be the types  
> of curves that people typically edit in a DAW application in the  
> timeline view.  Maybe AudioCurve could also be used for envelopes  
> and LFOs, or maybe it would make more sense to have a separate  
> AudioEnvelope type, etc.
>
> Any thoughts?
>
>
> What kinds of curves are needed? (Parametric, linear, quad, etc)
>
> Can the user supply their own curve algorithm in the same way users  
> can add easing modes to graphics animations?
>
> Can curves values/types be changed real-time etc?
>
> Can curves be quantized to a BPM clock?
>
> Can curves drive frequency parameters for synthesis?
>
>
>
> Al MacDonald
>
>

... .  .    .       Joe

Joe Berkovitz
President
Noteflight LLC
160 Sidney St
Cambridge, MA 02139
phone: +1 978 314 6271
