Re: Proposal: JavaScriptCueNode

I think this works fine as long as all of the audio sources have predictable
lengths. On Xbox we kept running into problems where the length of the audio
wasn't necessarily known when playback started, either because it was VBR
compressed and didn't have an accessible index, because something requested a
stop before playback was finished, or because the source was long enough that
clock drift came into play. In all of these cases users requested an event
that they could hook to respond to the end of playback.

Another common request was to trigger events based on metadata in the stream
itself, for instance RIFF files with transition points for interactive
music.

I don't want to go crazy with extra features, but allowing nodes to trigger
events does seem like a useful thing to do.
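As a rough sketch of what a node-level end-of-playback event might look like, simulated in plain JavaScript outside any real audio engine (the CueSource class and its onended/_finishPlayback names are invented for illustration, not part of any proposal here):

```javascript
// Hypothetical sketch: a source object that fires a callback once playback
// (or a stop request) finishes. In a real engine the audio thread would
// drive this; here we simulate the end of playback directly.
class CueSource {
  constructor() {
    this.onended = null;   // user-assigned end-of-playback callback
  }
  // Stand-in for the engine reaching the end of the source's audio.
  _finishPlayback() {
    if (this.onended) this.onended({ target: this });
  }
}

const src = new CueSource();
let fired = false;
src.onended = () => { fired = true; };
src._finishPlayback();
console.log(fired); // true
```

This sidesteps the unknown-length problem: the engine fires the event when playback actually ends, however that happens, instead of the app predicting the end time.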
Ian

On Thu, Oct 7, 2010 at 8:54 PM, Chris Rogers <crogers@google.com> wrote:

>
>
> On Wed, Oct 6, 2010 at 10:52 AM, Joseph Berkovitz <joe@noteflight.com> wrote:
>
>> Hi audio incubator,
>>
>> This is a simple and powerful idea that was suggested to me by audio tools
>> developer Joa Ebert of Hobnox last year (no connection to this group's work)
>> and I want to share it here. I think it could have a lot of value and seems
>> relatively easy to implement.
>>
>> *JavaScriptCueNode (new feature)*
>>
>> It is very useful to be able to have an audio scheduler cue the invocation
>> of some arbitrary code at a specific point on the audio timeline, with no
>> accompanying audio output.
>>
>> Use cases:
>>
>> - control some other piece of the application in synchrony with what is
>> happening in the audio at that point.  (Example: make some graphical element
>> highlight in time to a musical beat, as in the Drum Machine demo)
>>
>> - generate a new section of audio experience that smoothly extends the
>> current one before it ends (Example: dynamic generation of loops in a gaming
>> experience, as in the Drum Machine demo)
>>
>> Such functions are easiest to drive from the audio framework, instead of
>> having the rest of the app try to trigger itself at the exact correct moment
>> or interval.  For example, look at the schedule() function in the Drum
>> Machine demo, and the somewhat tricky time calculations and setTimeout()
>> calls that it has to perform to loop/advance the groove and draw the
>> playhead.  This would be much easier to code if it were called on a rhythmic
>> basis coming out of the audio engine itself.
>>
>> Such a node can be simply realized in the API as something similar to a
>> JavaScriptAudioNode but with zero outputs, and which exposes a noteOn()
>> function to schedule its occurrence.  Something like:
>>
>> interface JavaScriptCueNode : AudioNode {
>>    attribute EventListener onnoteprocess;
>>    void noteOn(in float when);
>> }
>>
>>
> Hi Joe, this sounds like a useful idea.  But I think I wouldn't implement it
> as an AudioNode, since it's not processing audio in any way, but instead as
> a method on AudioContext, something like:
>
> context.scheduleTimer(time, callbackFunction);
>
> The "time" parameter would be on the same timescale as the "currentTime"
> attribute of the context.  Special care would need to be taken to ensure
> that excessively large numbers of event listeners don't get fired.  Some
> kind of throttling mechanism would need to be implemented.  This all has to
> be balanced with the throttling mechanism for other timers such as
> setTimeout().
>
> Chris
>
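For illustration, the scheduleTimer() idea quoted above can be simulated in plain JavaScript with a sorted cue list that the audio clock drains. SimpleContext and advanceTo() below are invented stand-ins for a real context and its audio clock; only scheduleTimer(time, callback) mirrors the suggestion:

```javascript
// Minimal sketch of a scheduleTimer()-style API, simulated outside a real
// audio engine. "time" is on the same timescale as currentTime, as Chris
// describes; advanceTo() stands in for the audio clock moving forward.
class SimpleContext {
  constructor() {
    this.currentTime = 0;
    this._cues = [];                              // pending {time, callback}
  }
  scheduleTimer(time, callback) {
    this._cues.push({ time, callback });
    this._cues.sort((a, b) => a.time - b.time);   // keep earliest cue first
  }
  // Fire every cue whose scheduled time has been reached, passing the
  // scheduled time to the callback.
  advanceTo(newTime) {
    this.currentTime = newTime;
    while (this._cues.length && this._cues[0].time <= newTime) {
      const cue = this._cues.shift();
      cue.callback(cue.time);
    }
  }
}

const ctx = new SimpleContext();
const fired = [];
ctx.scheduleTimer(1.0, t => fired.push(t));
ctx.scheduleTimer(0.5, t => fired.push(t));
ctx.advanceTo(0.75);   // only the 0.5s cue has come due
console.log(fired);    // [ 0.5 ]
ctx.advanceTo(1.25);   // now the 1.0s cue fires too
console.log(fired);    // [ 0.5, 1 ]
```

Throttling, as Chris notes, would be the hard part in a real implementation: the loop in advanceTo() would need a cap on callbacks fired per clock tick so a flood of cues can't stall the engine.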


-- 
Ian Ni-Lewis
Developer Advocate
Google Game Developer Relations

Received on Thursday, 7 October 2010 22:03:01 UTC