Re: Web Audio thoughts

On Tue, Jun 12, 2012 at 11:40 AM, Ray Bellis <ray@bellis.me.uk> wrote:

> On 12/06/2012 18:49, Chris Rogers wrote:
>
>> Hi Ray, that sounds very cool!
>
> You should try it - for the amount of code needed so far I think it is
> kinda cool :)
>
>
>>    1.  AudioParam variables
>>
>>    There's no way I can see to have an AudioParam property in a
>>    JavaScriptAudioNode and then have that node sample the parameter.  The
>>    JS node is very useful, but without access to AudioParam features it's
>>    kind of a second-class citizen.
>>
>> Yes, Jussi has brought this up before, and we agreed it could be useful.
>> But for now at least, we're not adding this part in.  We could add this
>> in the future...
>
> That would be good.  I understand the discussion about threads, and
> blocking, and all that, but a JSAudioNode has all those problems anyway.
>
>
>>    For example, I might want to implement my ADSR EG using a JS AudioNode.
>>    In "real" synthesizers it's common to have the ability to tweak the ADSR
>>    envelope based on input frequency.  I don't think I can do that with the
>>    current API.
>>
>> In general, AudioParams can handle this type of thing.  If I can find
>> the time, I'd like to create some more examples using envelopes to show
>> some of these things.
>
> Sure, I know I can use AudioParams to _create_ an EG.  I also want to use
> them to _tune_ the EG ;)
>
>
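Just to sketch what I mean by building and tuning an envelope with
AudioParams (an untested sketch; the numbers and the playNote() name are
made up for illustration, and I'm using the current webkit-prefixed names):

    var context = new webkitAudioContext();

    function playNote(freq, startTime) {
      var osc = context.createOscillator();
      osc.frequency.value = freq;

      var envelope = context.createGainNode();   // AudioGainNode
      var gain = envelope.gain;

      // "Tune" the envelope per note: shorter attack/decay for higher notes.
      var scale = Math.sqrt(440 / freq);
      var attack = 0.02 * scale;
      var decay = 0.3 * scale;
      var sustain = 0.6;
      var holdTime = 1.0;
      var release = 0.5;

      gain.setValueAtTime(0, startTime);
      gain.linearRampToValueAtTime(1, startTime + attack);
      gain.linearRampToValueAtTime(sustain, startTime + attack + decay);
      gain.setValueAtTime(sustain, startTime + holdTime);   // anchor the release
      gain.linearRampToValueAtTime(0, startTime + holdTime + release);

      osc.connect(envelope);
      envelope.connect(context.destination);
      osc.noteOn(startTime);
      osc.noteOff(startTime + holdTime + release);
    }

    playNote(440, context.currentTime);

This is note-rate tuning rather than the continuous, signal-driven tuning
you describe, so it may not cover everything you have in mind.
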
>> By the way, one thing which is now possible is to feed the output from a
>> JavaScriptAudioNode into a parameter, thus controlling the parameter
>> with an audio-rate signal generated in JS.  This is kind of the opposite
>> of what you're describing, but is useful.
>
> Yes, exactly.
>
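In case it helps, here's roughly what that looks like (an untested sketch,
again with the webkit-prefixed names):

    var context = new webkitAudioContext();

    var osc = context.createOscillator();
    osc.frequency.value = 440;

    // 1 input / 1 output channel; the input isn't used here.
    // (Depending on the implementation you may need to connect something
    //  to the input for processing to start.)
    var jsNode = context.createJavaScriptNode(1024, 1, 1);
    var phase = 0;
    jsNode.onaudioprocess = function (event) {
      var output = event.outputBuffer.getChannelData(0);
      for (var i = 0; i < output.length; i++) {
        // Any JS-computed control signal works; here, a 3 Hz sine in -1..+1.
        output[i] = Math.sin(phase);
        phase += 2 * Math.PI * 3 / context.sampleRate;
      }
    };

    var depth = context.createGainNode();
    depth.gain.value = 50;          // scale -1..+1 up to +/- 50 Hz

    jsNode.connect(depth);
    depth.connect(osc.frequency);   // drive the AudioParam at audio rate
    osc.connect(context.destination);
    osc.noteOn(0);
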
>>    2.  More fine-grained "disconnect"
>>
>> Yes, we've talked a little about this before.  It'll be something we
>> should address.
>
> Great :)
>
>>    3.  Interrogation of the node graph
>>
>> I think we've discussed this before.  Although we could add such an API,
>> it's not that hard for JS wrapper code to keep track of these connections.
>
> Indeed, it's not _that_ hard.
>
> However, for my example I _do_ need the .numberOfInputs and
> .numberOfOutputs properties.
>
> I understand, having read some of the list archive, that discussion of node
> graph properties has been linked with _removing_ those properties, but I
> would lean strongly in the other direction.
>
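To expand on the wrapper idea above: the bookkeeping I have in mind is only
a few lines (just a sketch; connectAndTrack() and friends are made-up names,
not part of the API):

    var graph = [];   // { source, destination } records

    function connectAndTrack(source, destination) {
      source.connect(destination);
      graph.push({ source: source, destination: destination });
    }

    function connectionsFrom(source) {
      return graph.filter(function (edge) { return edge.source === source; });
    }

    function disconnectAndForget(source) {
      source.disconnect();   // today's all-at-once disconnect
      graph = graph.filter(function (edge) { return edge.source !== source; });
    }

That only tracks connections made through the wrapper, of course.
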
> FWIW, some more detailed documentation on the units used by AudioParams
> would be _really_ useful.  For example, I've actually no idea what would
> happen if I mixed the output from an LFO with the frequency input of an
> Oscillator.  The LFO has a nominal output range of -1 to +1, so what would
> happen to the frequency?
>

I've tried my best to document the units correctly for all of the
AudioParams.  For example, the .frequency attribute is in Hertz, so the LFO's
nominal -1 to +1 output would correspond to a deviation of only -1 to +1 Hz.
You'd probably want to scale this to a somewhat larger range by first
amplifying it with an AudioGainNode before connecting it to .frequency.
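Something along these lines (sketched from memory, untested):

    var context = new webkitAudioContext();

    var carrier = context.createOscillator();
    carrier.frequency.value = 440;                   // Hertz

    var lfo = context.createOscillator();
    lfo.frequency.value = 5;                         // 5 Hz modulation

    var modulationDepth = context.createGainNode();  // AudioGainNode
    modulationDepth.gain.value = 30;                 // -1..+1 becomes +/- 30 Hz

    lfo.connect(modulationDepth);
    modulationDepth.connect(carrier.frequency);      // into the AudioParam
    carrier.connect(context.destination);

    lfo.noteOn(0);
    carrier.noteOn(0);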

You can check out my simple FM demo here which does this kind of thing:
http://chromium.googlecode.com/svn/trunk/samples/audio/oscillator-fm2.html

Cheers,
Chris


>
> cheers,
>
> Ray
>

Received on Tuesday, 12 June 2012 19:00:35 UTC