Re: Starting

Hi Ehsan,

Please take a look at my response and pseudocode below regarding this point...

> "The time when the audio will be played in the same time coordinate system as AudioContext.currentTime. playbackTime allows for very tight synchronization between processing directly in JavaScript with the other events in the context's rendering graph."
> 
> I believe that this leaves no room for playbackTime to be inaccurate. The value of playbackTime in an AudioProcessEvent must exactly equal the time T at which a sound scheduled with node.start(T) would be played simultaneously with the first frame of the AudioProcessEvent's sample block.
> 
> I have not yet experimented with playbackTime in Gecko, but I originally proposed the feature for inclusion in the spec, and the above definition is how it needs to work if it's to be useful for synchronization.
> 
> You're right about the current text in the spec, but we should probably change it since what you're asking for is pretty much impossible to implement.  Imagine this scenario: let's say that the ScriptProcessorNode wants to dispatch an event with a properly calculated playbackTime.  Let's say that the event handler looks like this:
> 
> function handleEvent(event) {
>   // assume that AudioContext.currentTime can change its value without hitting the event loop
>   while (event.playbackTime < event.target.context.currentTime);
> }
> 
> Such an event handler would just wait until playbackTime is passed, and then return, and therefore it would make it impossible for the ScriptProcessorNode to operate without latency.

That is not the way that one would make use of event.playbackTime in a ScriptProcessorNode. As you say, looping inside an event handler like this makes no sense and will wreck the operation of the system.

The sole purpose of event.playbackTime is to let the code inside the event handler know at what time the samples that it generates will be played.  Not only is this not impossible to implement, it's quite practical, since it's what any "schedulable" source like Oscillators and AudioBufferSourceNodes must do under the hood.
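For what it's worth, here is a rough sketch of the bookkeeping an implementation might do. This is purely illustrative; the function name and the fixed-queue-of-buffers model are my own assumptions, not spec text:

```javascript
// Purely illustrative: one way an implementation might derive playbackTime,
// assuming the node keeps a queue of already-rendered buffers ahead of the
// audio hardware. (computePlaybackTime and the queue model are hypothetical.)
function computePlaybackTime(currentTime, bufferSize, sampleRate, buffersQueued) {
  // The block being filled now plays only after the queued buffers drain,
  // so its first frame lands this many seconds past currentTime.
  return currentTime + (buffersQueued * bufferSize) / sampleRate;
}
```

The point is simply that the engine always knows when the block it is asking for will reach the output, so reporting that time in the event is cheap.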

Here's how it's intended to be used. Going back to pseudocode, let's say you want to start both an Oscillator and some noise at some time T, in mono:

var oscillator = context.createOscillator();
// ...also configure the oscillator...
oscillator.connect(context.destination);
oscillator.start(T);

var processor = context.createScriptProcessor(4096, 1, 1); // mono output
processor.connect(context.destination);
processor.onaudioprocess = function(event) {
  var output = event.outputBuffer.getChannelData(0);
  var sampleRate = event.outputBuffer.sampleRate;
  for (var i = 0; i < event.outputBuffer.length; i++) {
    // The time at which this particular sample will be played.
    var sampleTime = event.playbackTime + i / sampleRate;
    // Noise from time T onward, silence before it.
    output[i] = (sampleTime >= T) ? (Math.random() * 2 - 1) : 0;
  }
};

There is in fact no other reliable mechanism in the API for script nodes to synchronize their output with "schedulable" sources, which is why this got into the spec in the first place.
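Incidentally, the sample-by-sample comparison in the handler above can be factored into a small helper. The helper below is hypothetical (not part of the API), just to show the arithmetic that playbackTime makes possible:

```javascript
// Hypothetical helper: map a context time T to the sample index within the
// current processing block, or -1 if T falls outside this block entirely.
function timeToSampleIndex(T, playbackTime, blockLength, sampleRate) {
  var offset = Math.round((T - playbackTime) * sampleRate);
  return (offset >= 0 && offset < blockLength) ? offset : -1;
}
```

Inside the handler you could then zero-fill up to that index and generate noise from it onward, instead of testing every sample.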

…Joe

Received on Tuesday, 23 April 2013 21:42:31 UTC