- From: Chris Rogers <crogers@google.com>
- Date: Thu, 9 Jun 2011 11:46:59 -0700
- To: Jussi Kalliokoski <jussi.kalliokoski@gmail.com>
- Cc: public-audio@w3.org
- Message-ID: <BANLkTikwdG1DC8nwc5EMpurxPuXUm7dhGg@mail.gmail.com>
On Thu, Jun 9, 2011 at 12:15 AM, Jussi Kalliokoski <jussi.kalliokoski@gmail.com> wrote:

> On Tue, Jun 7, 2011 at 8:27 PM, Chris Rogers <crogers@google.com> wrote:
>
>> On Mon, Jun 6, 2011 at 1:48 PM, Jussi Kalliokoski <jussi.kalliokoski@gmail.com> wrote:
>>
>>> Yeah, that being the main case and also that of abstraction layers, such
>>> as if someone wrote, for instance, something called a MixerNode that uses
>>> gain nodes to take multiple inputs and such, maybe AUXes or whatever. Also,
>>> now that I think of it, it would be pretty handy if AudioParam were an
>>> instance of EventTarget, so you could add listener hooks for things like
>>> onChange or onAutomationEnd (I don't know about this one), don't you think?
>>> It might complicate things a bit, but to the benefit of hackability, and
>>> that's always a good thing if you want to attract developers.
>>
>> Hi Jussi, with automation the AudioParam value can change for every single
>> sample-frame, so I think it would be pretty dramatic to have an
>> EventListener get called thousands of times a second! Instead, I think it
>> should be possible to query the AudioParam for its value at any exact time.
>
> You're right. However, this makes me think of interaction with UI elements
> such as knobs. Could there be some call like AudioParam.scheduleValue
> (that's a stupid name for it, but...) that would define the value of that
> AudioParam relative to the time of calling, so that you could get a smooth
> change in the value instead of a change on every buffer-fill callback?

Yes, very recently I've added some sample-accurate, high-precision AudioParam scheduling. Earlier last year we discussed the need for parameter automation:

http://lists.w3.org/Archives/Public/public-xg-audio/2010Oct/0017.html

I also have a very small section in my specification with a note that more detail is needed:

http://chromium.googlecode.com/svn/trunk/samples/audio/specification/specification.html#EventScheduling-section

I've been working on this, and here's the AudioParam IDL with the new scheduling methods (an implementation is now available in canaries):

module webaudio {
    interface [
        Conditional=WEB_AUDIO
    ] AudioParam {
        attribute float value;
        readonly attribute float minValue;
        readonly attribute float maxValue;
        readonly attribute float defaultValue;

        readonly attribute DOMString name;

        // FIXME: Could define units constants here (seconds, decibels, cents, etc.)...
        readonly attribute unsigned short units;

        // Parameter automation.
        void setValueAtTime(in float value, in float time);
        void linearRampToValueAtTime(in float value, in float time);
        void exponentialRampToValueAtTime(in float value, in float time);

        // Exponentially approach the target value with a rate having the given time constant.
        void setTargetValueAtTime(in float targetValue, in float time, in float timeConstant);

        // Sets an array of arbitrary parameter values starting at time for the given duration.
        // The number of values will be scaled to fit into the desired duration.
        void setValueCurveAtTime(in Float32Array values, in float time, in float duration);

        // Cancels all scheduled parameter changes with times greater than or equal to startTime.
        void cancelScheduledValues(in float startTime);
    };
}

These methods allow many possibilities, such as ordinary parameter automation as found in a DAW, ADSR (and more complex) envelopes, grain envelopes, LFOs, fade-in/out curves for musical segues, amplitude modulation, etc.
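Just as a rough sketch of how these methods might be used from JavaScript (assuming the prefixed webkitAudioContext and createGainNode() names as currently exposed in canaries; the exact names may change), an ADSR-style envelope on a gain parameter could look something like this:

// Rough sketch only: assumes webkitAudioContext / createGainNode() as in current canaries.
var context = new webkitAudioContext();
var gainNode = context.createGainNode();
gainNode.connect(context.destination);

var gain = gainNode.gain;        // an AudioParam
var t0 = context.currentTime;

// Simple ADSR-style envelope built from the scheduling methods above.
gain.setValueAtTime(0.0, t0);                   // start silent
gain.linearRampToValueAtTime(1.0, t0 + 0.02);   // attack: 20 ms up to full level
gain.linearRampToValueAtTime(0.6, t0 + 0.10);   // decay down to the sustain level

// Release: exponentially approach zero with a 50 ms time constant,
// starting one second after the note begins.
gain.setTargetValueAtTime(0.0, t0 + 1.0, 0.05);

// If the user grabs a knob before the envelope finishes, the remaining
// automation can be cleared from that point on:
// gain.cancelScheduledValues(context.currentTime);

The same pattern works for LFO-like modulation or fade curves; setValueCurveAtTime() covers arbitrary shapes such as grain envelopes.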
Soon, I hope to add these to the specification proposal along with some detailed discussion, diagrams, etc. I've already had a little fun writing some test cases, and I hope to write some demos which I can post.

Cheers,
Chris
Received on Thursday, 9 June 2011 18:47:24 UTC