Re: Reflections on writing a sequencer

From: lonce wyse <lonce.wyse@zwhome.org>
Date: Wed, 25 Jul 2012 21:00:25 +0800
Message-ID: <500FEDE9.1090704@zwhome.org>
To: Joe Berkovitz <joe@noteflight.com>
CC: public-audio@w3.org

Hi -
     Yes, I realized that as I hit the send button. However, it
actually isn't the periodicity of the callbacks that matters: they
could be aperiodic, and the buffers to fill could be of different
lengths, as long as you know what time the sample buffer in the
callback represents.

- lonce


On 25/7/2012 8:46 PM, Joe Berkovitz wrote:
>
> One other important point I overlooked: JavaScriptAudioNode processing
> callbacks are not sample-accurate in terms of absolute time. They may
> jitter around, since they precede actual sound output by a variable
> amount depending on the audio pipeline's overall latency at the time.
> The browser is free to play around with this latency to provide
> glitch-free output.
>
> So it doesn't really provide you with the "rock solid" timing that you 
> might expect.
>
> ...j
>
> On Jul 25, 2012 8:28 AM, "lonce wyse" <lonce.wyse@zwhome.org> wrote:
>
>
>     Hi -
>         Of course, you would want to generate events as short a time
>     into the future as possible in order to stay responsive to rate
>     (or tempo) changes.
>         Ideally, a JavaScriptAudioNode could be used as the event
>     generator. Its onaudioprocess() method could check the length of
>     the output buffer it is passed, and do nothing else but issue
>     noteOn events for the other nodes it wants to play within that
>     short period of time.
>         I haven't tried that yet, but would noteOn events be handled
>     properly when generated in this "just in time" manner? And would
>     it be a violation of protocol to use an onaudioprocess() callback
>     as what would amount to a rock-solid, sample-accurate periodic
>     timer?
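[A minimal sketch of the "just in time" generator described above. It is hypothetical and untested against real browsers: `eventsForBuffer`, `eventTimes`, and the running sample counter are invented names (the counter stands in for a per-buffer timestamp), and the commented portion assumes the 2012-era createJavaScriptNode/noteOn API discussed in this thread.]

```javascript
// Hypothetical sketch: a JavaScriptAudioNode used purely as a periodic
// scheduling callback. Only events whose times fall inside the current
// buffer's time span get noteOn/noteOff calls.

// Pure helper: event times within [bufferStart, bufferStart + bufferDur).
function eventsForBuffer(eventTimes, bufferStart, bufferDur) {
  return eventTimes.filter(function (t) {
    return t >= bufferStart && t < bufferStart + bufferDur;
  });
}

// In a browser (assumes an AudioContext `c` and an `eventTimes` array):
//
// var js = c.createJavaScriptNode(1024, 1, 1);
// var samplesDone = 0; // running counter, since playbackTime is not exposed
// js.onaudioprocess = function (e) {
//   var start = samplesDone / c.sampleRate;
//   var dur = e.outputBuffer.length / c.sampleRate;
//   eventsForBuffer(eventTimes, start, dur).forEach(function (t) {
//     var o = c.createOscillator();
//     o.connect(c.destination);
//     o.noteOn(t);
//     o.noteOff(t + 0.1);
//   });
//   samplesDone += e.outputBuffer.length;
// };
// js.connect(c.destination); // the node must be connected to run
```

[Note that, per Joe's caveat above, these callbacks are sample-accurate only relative to the stream of buffers, not to absolute wall-clock time.]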
>
>     Best,
>                  - lonce
>
>
>
>     On 25/7/2012 12:40 AM, Joseph Berkovitz wrote:
>>     HI Adam,
>>
>>     I think one general way to structure sequencer playback is as
>>     follows -- I've used this approach with WebAudio successfully in
>>     the past:
>>
>>     1. Just before starting playback, take note of the AudioContext's
>>     currentTime property.  Add a small time offset to it, say 100 ms.
>>      The result will be your performance start time, corresponding to
>>     time offset zero in your sequencer data.  (The time offset
>>     provides a short window in which to schedule the first events in
>>     the sequence).
>>
>>     2. Create a scheduler function that will run periodically, which
>>     examines the AudioContext's currentTime and subtracts the
>>     previously captured startTime. That gives you a "current
>>     performance time" at the moment the callback occurs, expressed in
>>     terms of your sequencer data.  Then create AudioNodes
>>     representing all sequencer events that occur within an arbitrary
>>     time window after this current performance time (say, several
>>     seconds) and schedule them with noteOn/noteOff.
>>
>>     3. Call the function immediately, and also use setInterval() or
>>     setTimeout() to schedule callbacks to the above function on some
>>     reasonable basis, say every 100-200 ms. The exact interval is not
>>     important and can be tuned for best performance.
>>
>>     This approach is relatively insensitive to callback timing and in
>>     general allows audio to be scheduled an arbitrary interval in
>>     advance of its being played.
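[The three numbered steps above can be sketched roughly as follows. This is a hypothetical illustration, not code from the thread: the `{ time, dur }` event format, `eventsInWindow`, and `LOOKAHEAD` are invented names, and the commented browser portion assumes the 2012-era webkitAudioContext/noteOn API discussed here.]

```javascript
// Sketch of the look-ahead scheduler described in steps 1-3.
// Sequencer data: an array of { time, dur } objects, in seconds
// relative to performance time zero.

var LOOKAHEAD = 2.0; // schedule this many seconds ahead (step 2)

// Pure helper: events at or after `scheduledUpTo` that fall inside
// the look-ahead window starting at `perfTime`.
function eventsInWindow(events, perfTime, lookahead, scheduledUpTo) {
  return events.filter(function (ev) {
    return ev.time >= scheduledUpTo && ev.time < perfTime + lookahead;
  });
}

// In a browser it would be driven like this (steps 1 and 3):
//
// var c = new webkitAudioContext();
// var startTime = c.currentTime + 0.1;   // step 1: small start offset
// var scheduledUpTo = 0;
// setInterval(function () {
//   var perfTime = c.currentTime - startTime;          // step 2
//   eventsInWindow(events, perfTime, LOOKAHEAD, scheduledUpTo)
//     .forEach(function (ev) {
//       var o = c.createOscillator();
//       o.connect(c.destination);
//       o.noteOn(startTime + ev.time);
//       o.noteOff(startTime + ev.time + ev.dur);
//     });
//   scheduledUpTo = perfTime + LOOKAHEAD;
// }, 150);                               // step 3: every 100-200 ms
```

[Because events are scheduled well ahead of the callback, the exact setInterval period only has to be comfortably shorter than the look-ahead window.]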
>>
>>     ...Joe
>>
>>
>>     On Jul 24, 2012, at 11:40 AM, Adam Goode wrote:
>>
>>>     Hi,
>>>
>>>     Yesterday I tried to write an extremely simple sequencer using
>>>     webaudio. My goal was to have a tone play periodically, at a
>>>     user-selectable low frequency interval.
>>>
>>>     The main problem I ran into was the difficulty of scheduling
>>>     events synchronized with the a-rate clock.
>>>
>>>     If I want to play a tone twice per second, I want to call this
>>>     code in a loop, indefinitely:
>>>
>>>     var startTime = ....
>>>     var o = c.createOscillator();
>>>     o.connect(c.destination);
>>>     o.noteOn(startTime);
>>>     o.noteOff(startTime + 0.1);
>>>
>>>     I can't just put it in a loop; I need to schedule this in a
>>>     callback, as necessary to keep the event queue filled. But which
>>>     callback should I use? setInterval is not appropriate, since the
>>>     setInterval clock will skew quickly from c.currentTime. And busy
>>>     looping with setInterval(0) will consume a lot of CPU and get
>>>     throttled when switching tabs (try putting the drum machine demo
>>>     in a background tab and see).
>>>
>>>     My solution was this:
>>>
>>>     var controlOscillator = c.createOscillator();
>>>     controlOscillator.frequency.value = 2;
>>>     var js = c.createJavaScriptNode(256, 1, 0);
>>>     controlOscillator.connect(js);
>>>
>>>     var previous = 0;
>>>     js.onaudioprocess = function(e) {
>>>       var samples = e.inputBuffer.getChannelData(0);
>>>       for (var i = 0; i < samples.length; i++) {
>>>         // detect positive zero crossing from control oscillator
>>>         if (previous <= 0 && samples[i] > 0) {
>>>           var o = c.createOscillator();
>>>           o.connect(c.destination);
>>>           // zero crossing offset + playbackTime (see bug below)
>>>           var startTime = e.playbackTime + i / c.sampleRate;
>>>           o.noteOn(startTime);
>>>           o.noteOff(startTime + 0.1);
>>>         }
>>>         previous = samples[i];
>>>       }
>>>     };
>>>
>>>
>>>     This does work (except for the missing playbackTime,
>>>     https://bugs.webkit.org/show_bug.cgi?id=61524, the need to
>>>     connect the JavaScript node to the destination, and another bug
>>>     in Chrome, http://crbug.com/138646), but it is awkward. There is
>>>     also the question of having a disconnected graph: I am sending
>>>     control data, not audio data, so I don't want to connect it to
>>>     the destination.
>>>
>>>     I essentially want to have a callback for getting new control
>>>     data, to keep the event pipeline filled without overflowing any
>>>     noteOn buffer or falling behind. Is the javascript node
>>>     appropriate for this? I feel like there could be something more
>>>     explicit, like a setInterval off of the audio context.
>>>
>>>
>>>
>>>     Adam
>>>
>>>
>>
>>     ... .  .    .       Joe
>>
>>     *Joe Berkovitz*
>>     President
>>
>>     *Noteflight LLC *
>>     84 Hamilton St, Cambridge, MA 02139
>>     phone: +1 978 314 6271
>>     www.noteflight.com
>>
>
Received on Wednesday, 25 July 2012 13:01:14 GMT
