- From: Alistair MacDonald <al@signedon.com>
- Date: Tue, 27 Mar 2012 15:31:48 -0400
- To: Patrick Borgeat <patrick@borgeat.de>
- Cc: Chris Rogers <crogers@google.com>, public-audio@w3.org
- Message-ID: <CAJX8r2=0e3dcQ_j1QeWcF4J6r1GBrR7oR05hw3oW6FqY5s-9RA@mail.gmail.com>
OK, I see: your buffer is affecting the parameter, not the stream directly, and the interpolation is being done by the engine, not by JavaScript. My hunch is that, because you are thinking of using a JavaScript callback anyway, you may still be bound by some of the same performance issues. But those considerations are a little out of my depth. I know Chris Rogers is very busy right now, but I would be interested in his thoughts on this when he becomes available.

On Tue, Mar 27, 2012 at 3:17 PM, Patrick Borgeat <patrick@borgeat.de> wrote:

> Am 27.03.2012 um 20:58 schrieb Alistair MacDonald:
>
>> It seems like you are updating a buffer in your callback, but at the same
>> time you are calling "setValue..." -- thus setting a value to be
>> interpolated to the audio engine.
>
> The buffer I create is intended to control an audio parameter, not to be
> played as audio. Ideally a user could specify a buffer size and an
> interpolation factor, so the audio engine could interpolate these values
> (mimicking control-rate signals as used in Csound or SuperCollider).
>
> For example:
>
> buffer size: 8, interpolation factor: 64
>
> The callback would occur every 512 (8 * 64) audio samples, requesting 8
> values from the callback function. The AudioParam would then interpolate
> these values for every audio sample.
>
> But this might get too complicated.
>
>> If you are changing the buffer continuously, I believe using the
>> existing JavaScript node could be better suited to your needs in this
>> use case. But please correct me if I am missing something key here.
>
> I can see how this could be achieved with a JavaScript node, but setting
> an AudioParam from a probably totally unrelated JavaScript node (that in
> fact doesn't produce any audio) doesn't seem very clean to me.
> Directly using an audio signal to control a parameter, on the other hand,
> would be very handy (but would require low-rate signals and automatic
> up-sampling/interpolation to work efficiently).
>
> So you would suggest that I attach a JavaScript node which does something
> like …
>
> unrelatedFilter.frequency.value = Math.sin(…)
>
> … ? (which probably wouldn't work very well that way; the node would have
> to compute the whole next block of control signals and set it with
> setValueCurveAtTime)
>
> Maybe I'm too used to flexible computer music systems (like SuperCollider)
> and expect too much of a general-purpose Web Audio API. But the current Web
> Audio API would require me to reimplement an audio graph inside a
> JavaScript node for the use case "Artistic Audio Exploration".
>
> Patrick

--
Alistair MacDonald
SignedOn, Inc - W3C Audio WG
Boston, MA, (707) 701-3730
al@signedon.com - http://signedon.com
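The control-rate scheme discussed in this thread (buffer size 8, interpolation factor 64) can be sketched in plain JavaScript. Nothing below is part of the Web Audio API: the function name and sample values are made up for illustration, and linear interpolation is assumed as the engine's behavior.

```javascript
// Sketch of the proposed control-rate scheme: an 8-value control buffer
// is linearly interpolated by a factor of 64, yielding one parameter
// value per audio sample over a 512-sample block.
function interpolateControlBuffer(values, factor) {
  const out = new Float32Array(values.length * factor);
  for (let i = 0; i < values.length; i++) {
    const a = values[i];
    // Wrap around for the last segment; a real engine would instead
    // interpolate toward the first value of the next callback's buffer.
    const b = values[(i + 1) % values.length];
    for (let j = 0; j < factor; j++) {
      out[i * factor + j] = a + (b - a) * (j / factor);
    }
  }
  return out;
}

// 8 control values, factor 64: the callback would fire every 512 samples.
const block = interpolateControlBuffer([0, 1, 0.5, 0.25, 0, -0.5, -1, 0], 64);
```

The callback thus runs 64 times less often than per-sample JavaScript processing would, while the per-sample smoothing stays in the engine.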
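The setValueCurveAtTime workaround Patrick mentions can also be sketched. Only the curve computation below actually runs; the graph call in the final comment assumes a hypothetical `filter` node and is shown for context only.

```javascript
// Precompute one block of control values as a Float32Array, as one might
// pass to AudioParam.setValueCurveAtTime(). Here the control signal is a
// single sine cycle around a center value (e.g. a slow frequency wobble);
// all parameter values are illustrative.
function makeSineCurve(length, cycles, center, depth) {
  const curve = new Float32Array(length);
  for (let i = 0; i < length; i++) {
    curve[i] = center + depth * Math.sin(2 * Math.PI * cycles * (i / length));
  }
  return curve;
}

const curve = makeSineCurve(512, 1, 440, 100);
// In a real graph (node name hypothetical):
// filter.frequency.setValueCurveAtTime(curve, startTime, blockDuration);
```

A JavaScript node's onaudioprocess callback could recompute and reschedule such a curve once per block, which is the "compute the whole next block of control signals" step described above.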
Received on Tuesday, 27 March 2012 19:32:17 UTC