Re: Aiding early implementations of the web audio API

On Tue, May 22, 2012 at 3:28 PM, Robert O'Callahan <robert@ocallahan.org> wrote:

> On Wed, May 23, 2012 at 9:22 AM, Colin Clark <colinbdclark@gmail.com> wrote:
>
>> I think that's a really good start, yes! The key, as Jussi has just
>> mentioned, is to think through how we might expose the behaviour of the
>> built-in AudioNodes in a manner that authors of JavaScriptAudioNodes can
>> harness. If a native FFT can blow away one implemented in JavaScript (such
>> as the one implemented by Ofm Labs), perhaps it should be exposed in a way
>> that is not dependent on use of the RealtimeAnalyzerNode?
>>
>
> I think exposing an FFT library directly to JS (operating on JS typed
> arrays) is a no-brainer. It should be fairly easy to spec and implement.
>
> Output signals from AudioNodes can be piped into a Javascript processing
> node, giving you some reuse there.
>

Indeed.
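
To make Robert's "FFT operating on JS typed arrays" idea concrete, I picture
something as small as this (the names here are purely illustrative - nothing
like this is specced yet):

  // Hypothetical shape for a directly-exposed FFT; illustrative only.
  var fft  = new FFT(2048);                  // transform size in frames
  var time = new Float32Array(2048);         // time-domain input samples
  var mag  = new Float32Array(1024);         // magnitude spectrum output
  time.set(currentBlockOfSamples);           // however you obtained the samples
  fft.forward(time, mag);                    // fill mag from time

Easy to spec, and usable from any JavaScriptAudioNode.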

One question - "exposing the behaviour of built-in AudioNodes in a manner
that authors of JavaScriptAudioNodes can harness" sounds like subclassing
those nodes to me, which isn't the same thing as providing only lower-level
libraries (like FFT) and asking developers to do the hook-up in JS nodes.
What's the desire here? I think Robert and Jussi are suggesting not to
have the native nodes; Colin seems to be saying "just make sure you can
utilize the underlying bits in JSNode". Is that a fair reading?
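
Concretely, I read the "do the hook-up in JS nodes" version as something like
the following rough sketch against the current draft, with jsFFT() standing in
for whatever JS FFT library you'd use (e.g. the Ofm Labs one), and
someSourceNode for any built-in node:

  var ctx = new webkitAudioContext();
  // 2048-frame blocks, one input channel, one output channel
  var analyser = ctx.createJavaScriptNode(2048, 1, 1);

  analyser.onaudioprocess = function (e) {
    var input = e.inputBuffer.getChannelData(0);
    var spectrum = jsFFT(input);                  // stand-in for a JS (or future native) FFT
    // ...inspect/draw spectrum here...
    e.outputBuffer.getChannelData(0).set(input);  // pass the audio through unchanged
  };

  someSourceNode.connect(analyser);   // any built-in node's output can feed the JS node
  analyser.connect(ctx.destination);

That works, but it means every author ships their own FFT rather than reusing
the native one - which I take to be the crux of Colin's question.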

>> I'm still coming up to speed on the spec, so I'll continue to mull it over
>> with this in mind. Another thing, off the top of my head, that stands out
>> is the noteOn/noteGrainOn/noteOff methods that some AudioNodes implement.
>> It wasn't clear to me from reading the spec if JavaScriptAudioNodes can
>> also implement this behaviour?
>>
>
> No. Having the ability to schedule the turning on and off of arbitrary
> streams/nodes is one of the features MediaStreams Processing has that Web
> Audio doesn't.
>

That statement (scheduling turning on and off of arbitrary streams/nodes is
in MSP but not WA) is true; however, you COULD implement
noteOn/noteGrainOn/noteOff on your own JavaScriptAudioNode, could you not?
Say you were trying to implement some scenario that wanted a JS node that
functioned similarly to AudioBufferSourceNode.
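
Something like this, say - very rough, and only meant to show it's possible
within the current API (the 440Hz sine is just a stand-in for reading samples
out of an AudioBuffer):

  var ctx = new webkitAudioContext();
  // A "source-style" JS node: ignore the input, generate the output.
  // (Assumes onaudioprocess fires for a JS node with no input connected;
  // if the implementation balks at that, feed it silence.)
  var node = ctx.createJavaScriptNode(1024, 1, 1);
  var startTime = Infinity, stopTime = Infinity, phase = 0;

  node.noteOn  = function (when) { startTime = when; };
  node.noteOff = function (when) { stopTime  = when; };

  node.onaudioprocess = function (e) {
    var out = e.outputBuffer.getChannelData(0);
    var playing = ctx.currentTime >= startTime && ctx.currentTime < stopTime;
    for (var i = 0; i < out.length; i++) {
      // stand-in for pulling samples out of an AudioBuffer at the right offset
      out[i] = playing ? 0.25 * Math.sin(phase += 2 * Math.PI * 440 / ctx.sampleRate) : 0;
    }
  };

  node.connect(ctx.destination);
  node.noteOn(ctx.currentTime + 1);   // "start" in about a second
  node.noteOff(ctx.currentTime + 3);

The caveat is that the scheduling resolution is quantized to the JS node's
buffer size (and subject to main-thread jitter), so it isn't sample-accurate
the way the native noteOn() is - but functionally, yes, it can be done.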

noteOn/noteGrainOn/noteOff are not intended, in the WA API, to be a stream
start/pause.  Those are, to me, two separate functions.

-C

Received on Tuesday, 22 May 2012 23:46:57 UTC