Re: [web-audio-api] OfflineAudioContext and ScriptProcessorNodes (#69)

> [Original comment](https://www.w3.org/Bugs/Public/show_bug.cgi?id=22723#21) by Chris Rogers on W3C Bugzilla. Tue, 23 Jul 2013 23:29:13 GMT

(In reply to [comment #20](#issuecomment-24244264))
> (In reply to [comment #19](#issuecomment-24244254))
> > I agree that we shouldn't force it to run at 128.  But I think we should
> > change the current spec to allow for size 128, especially for the
> > OfflineAudioContext case.  Right now, the minimum size is 256, and we would
> > like to get zero latency for this offline case.
> 
> Hmm, I know that roc really wanted the buffer size to be a choice that the
> UA makes not one that the author makes...
> 
> But that aside, I still don't correctly understand why we have to fix the
> latency issue in the offline processing case by allowing a buffer size of
> 128.  Let's say that on a given path from a source node in the graph to the
> destination node, we have N ScriptProcessorNodes, all with a buffer size of
> > M (for simplicity's sake).  In this situation, these nodes are creating a
> latency of N*M frames.

If we use a buffer size of 128 (the size of the engine's internal processing block), then the total latency would still be zero even with N ScriptProcessorNodes, because each node can process each block as it arrives instead of buffering ahead.

I could actually create a test case which shows this.
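For illustration, here is a minimal sketch of the kind of test case I have in mind (not an actual test from the prototype), assuming a UA that accepts a 128-frame buffer size, which the current spec does not yet allow: push a single impulse through a pass-through ScriptProcessorNode and check where it ends up in the rendered buffer.

```js
// Hypothetical zero-latency check. NOTE: a 128-frame buffer size is below the
// current spec minimum of 256, so this assumes the prototype behavior
// discussed above.
var sampleRate = 44100;
var length = sampleRate; // 1 second
var context = new OfflineAudioContext(1, length, sampleRate);

// Source: a buffer containing a single impulse at frame 0.
var impulse = context.createBuffer(1, length, sampleRate);
impulse.getChannelData(0)[0] = 1;
var source = context.createBufferSource();
source.buffer = impulse;

// Pass-through ScriptProcessorNode with a 128-frame buffer.
var processor = context.createScriptProcessor(128, 1, 1);
processor.onaudioprocess = function (event) {
  event.outputBuffer.getChannelData(0).set(event.inputBuffer.getChannelData(0));
};

source.connect(processor);
processor.connect(context.destination);
source.start(0);

context.oncomplete = function (event) {
  var data = event.renderedBuffer.getChannelData(0);
  var firstNonZero = -1;
  for (var i = 0; i < data.length; i++) {
    if (data[i] !== 0) { firstNonZero = i; break; }
  }
  // With zero latency the impulse is still at frame 0; with a 256-frame
  // buffer it would be delayed by 256 frames.
  console.log('impulse found at frame ' + firstNonZero);
};
context.startRendering();
```

Chaining several such nodes would show that the delay stays at zero rather than growing by N*M frames.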

> Can't we address this issue by disregarding the
> first N*M frames that the destination node observes?  (I'm not 100% sure if
> this solution works, but I can't completely convince myself either way.)
> 
> > I've created an early prototype in Chrome which synchronizes the audio
> > thread with the main thread, and runs at 128.  I've found that the average
> > time between 
> > onaudioprocess callbacks is around 50 microseconds.  I tried a really simple
> > test case:
> > 
> > AudioBufferSourceNode -> ScriptProcessorNode ->
> > OfflineAudioContext.destination
> > 
> > and processed a time period several minutes long.  On a mid-range
> > Mac Pro I saw around 60x real-time performance.
> 
> Have you performed measurements on whether (and how much) that affects the
> responsiveness of the main thread?  I'm worried that if the audioprocess
> event takes let's say 5ms to process on average, this may degrade the
> performance of the page if it uses requestAnimationFrame to render an
> animation, for example.

Yes, I too was concerned about that possibly interfering with the main thread.  But the way I've implemented it, the synchronization requests are handled one at a time and do not "pile up".  I just created a test using requestAnimationFrame() to draw smoothly at (hopefully) 60fps while the OfflineAudioContext is running and calling back frequently to the main thread.  It seems to draw just fine, and I anticipate it would handle user events without a hitch, but I haven't yet tested that...
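In case it's useful, here is a rough sketch of that kind of test (not the actual test case, and again assuming a 128-frame buffer size is permitted): kick off an offline render with a ScriptProcessorNode in the graph, and log the requestAnimationFrame intervals on the main thread while it runs.

```js
// Offline render with frequent main-thread callbacks, plus a rAF loop that
// measures how evenly animation frames arrive while rendering is in progress.
// Assumes a prototype UA that accepts a 128-frame ScriptProcessorNode buffer.
var sampleRate = 44100;
var context = new OfflineAudioContext(1, sampleRate * 120, sampleRate); // 2 minutes of audio

var source = context.createBufferSource();
source.buffer = context.createBuffer(1, sampleRate, sampleRate);
source.loop = true;

var processor = context.createScriptProcessor(128, 1, 1);
processor.onaudioprocess = function (event) {
  event.outputBuffer.getChannelData(0).set(event.inputBuffer.getChannelData(0));
};

source.connect(processor);
processor.connect(context.destination);
source.start(0);

context.oncomplete = function () {
  console.log('offline rendering finished');
};
context.startRendering();

// Animation side: at a steady 60fps the interval should stay near 16.7 ms.
var lastTime = 0;
function frame(now) {
  if (lastTime) {
    console.log('frame interval: ' + (now - lastTime).toFixed(1) + ' ms');
  }
  lastTime = now;
  requestAnimationFrame(frame);
}
requestAnimationFrame(frame);
```

If the onaudioprocess handlers start taking several milliseconds each, the logged intervals should show the animation falling behind, which is exactly the degradation being asked about.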

---
Reply to this email directly or view it on GitHub:
https://github.com/WebAudio/web-audio-api/issues/69#issuecomment-24244276

Received on Wednesday, 11 September 2013 14:30:33 UTC