Re: Sync of scriptProcessorNode and native Node

Hello,

I'm aware of the public-audio list conversations about using workers for the scriptProcessorNode, and I'm very excited about the possibilities of that solution, but I assumed it was possible to sync a scriptProcessorNode and a native node with the current implementation. Am I wrong? And if not, how can it be achieved?

Thank you,

Arnau

On 5 May 2014, at 18:42, Chris Wilson <cwilso@google.com> wrote:

> Lonce,
> 
> this is one of the biggest and most important issues on my Web Audio plate right now.  I'm working on figuring out how to bring implementers together over the summer to come up with a workable solution.
> 
> 
> On Fri, May 2, 2014 at 9:38 PM, lonce <lonce.audio@sonic.zwhome.org> wrote:
> 
> Hi -
> 
>     I think the real question is not how to hack this, but the status of progress on a fundamental solution to this Achilles' heel of the current system. From what I gather, the solution will probably be in the form of web workers (?), but I don't know how much attention this is getting now.
>     Once this is solved, the system becomes truly extensible and I am sure it will open up an explosive era of community development just waiting to happen!
> 
> Best,
>              - lonce
> 
> 
> On 5/2/2014 4:39 PM, Arnau Julia wrote:
> Hello,
> 
> First of all, thanks for all your answers.
> 
> The first thing to note is that all script processor node processing happens on the main javascript thread.  This means if you change a
> global variable in another part of your javascript program, it will definitely show up on the next AudioProcessingEvent.  So, that
> answers your first problem - once you set the variable in your javascript, on the next buffer the change will be there.  There's no
> parallelism at all in the javascript - there's only one thing happening at once.
> I would like to understand how this works. The difference I found between the scriptProcessorNode and the 'native' AudioNode interface is that the former uses an event handler while the AudioNodes are EventTargets. Is that the reason why the global variables are updated only once per buffer? Does anyone have more documentation that explains this more deeply?
> 
> For your second question, you need some sort of timestamp on the buffer.  The web audio api provides this as the playbackTime field on
> the AudioProcessingEvent.  Of course, you only have access to the playback time of the buffer you are currently processing, but you can
> guess when the next playbackTime will be by setting the last processed time as a global variable, and then adding one buffer's worth of time
> to that to get the next playbackTime.  This will be fine unless you drop buffers, in which case you're probably not worried about a smooth
> ramp :-).  So, one easy solution to your second problem is to always store the last playback time that each of your script nodes processed,
> and then start the ramp on the *next* buffer.  The spec guarantees that the playbackTime and ramping is sample accurate, so no worries
> there.  In practice, the last time I checked, which was over a year ago, Firefox had serious problems with the playbackTime field (I don't
> remember if it was just absent or if it had some other problem that made it unusable.)
> That seems like a good solution! I couldn't find playbackTime in the latest stable version of Chrome, but I did find it in Firefox. Is there any alternative for Chrome?
> 
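> The only fallback I can think of (an untested sketch; bufferSize stands for
> whatever was passed to createScriptProcessor) is to keep a running estimate
> myself, seeded from currentTime, and prefer the real field when it exists:
> 
> var bufferSize = 4096; // same value given to createScriptProcessor
> var bufferDuration = bufferSize / audioContext.sampleRate;
> var estimatedPlaybackTime = audioContext.currentTime; // rough seed
> 
> scriptProcessorNode.onaudioprocess = function (event) {
>     // Use the real field where implemented, our estimate elsewhere.
>     var t = (event.playbackTime !== undefined) ? event.playbackTime
>                                                : estimatedPlaybackTime;
>     // ... process the buffer using t ...
>     estimatedPlaybackTime = t + bufferDuration; // predict the next buffer
> };
> 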
> I have done some basic experiments with playbackTime in Firefox and it seems that it is not totally in sync, or maybe I don't understand how to use it. I uploaded the experiment to jsfiddle (Firefox only!): http://jsfiddle.net/PgeLv/11/
> The experiment structure is:
> oscillatorNode (source) ----> scriptProcessorNode -----> GainNode -------> Destination
> 
> On the other hand, I would like to understand what exactly the playbackTime is. I guess it could be something like this:
> 
> playbackTime = bufferSize/sampleRate + 'processTime' + 'wait interval until the event returns the data to the audio thread'
> 
> If this hypothesis is true, it means that the playbackTime is different for each event, because it depends on the activity of the main thread.
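> 
> (For scale: with, say, bufferSize = 4096 and sampleRate = 44100 Hz, the fixed part bufferSize/sampleRate is 4096/44100 ≈ 92.9 ms per buffer; only the other two terms would vary from event to event.)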
> 
> Thanks,
> 
> Arnau
> 
> On 22 Apr 2014, at 01:51, Russell McClellan <russell.mcclellan@gmail.com> wrote:
> 
> Hey Arnau -
> 
> Yes, this is probably underdocumented.  The good news is, the
> designers of the web audio api do actually have an answer for linking
> native nodes and script processor nodes.
> 
> The first thing to note is that all script processor node processing
> happens on the main javascript thread.  This means if you change a
> global variable in another part of your javascript program, it will
> definitely show up on the next AudioProcessingEvent.  So, that
> answers your first problem - once you set the variable in your
> javascript, on the next buffer the change will be there.  There's no
> parallelism at all in the javascript - there's only one thing
> happening at once.
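> 
> A minimal sketch of that single-threaded behaviour (the names are made
> up; it assumes an existing audioContext):
> 
> var coefficient = 0.5; // shared state on the main thread
> 
> var node = audioContext.createScriptProcessor(4096, 1, 1);
> node.onaudioprocess = function (event) {
>     var input = event.inputBuffer.getChannelData(0);
>     var output = event.outputBuffer.getChannelData(0);
>     // onaudioprocess runs on the main thread too, so any change made
>     // to "coefficient" elsewhere is guaranteed to be visible here.
>     for (var i = 0; i < input.length; i++) {
>         output[i] = input[i] * coefficient;
>     }
> };
> 
> // Later, anywhere in the program; takes effect on the next buffer:
> coefficient = 0.8;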
> 
> For your second question, you need some sort of timestamp on the
> buffer.  The web audio api provides this as the playbackTime field on
> the AudioProcessingEvent.  Of course, you only have access to the
> playback time of the buffer you are currently processing, but you can
> guess when the next playbackTime will be by setting the last processed
> time as a global variable, and then adding one buffer's worth of time
> to that to get the next playbackTime.  This will be fine unless you
> drop buffers, in which case you're probably not worried about a smooth
> ramp :-).  So, one easy solution to your second problem is to always
> store the last playback time that each of your script nodes processed,
> and then start the ramp on the *next* buffer.  The spec guarantees
> that the playbackTime and ramping is sample accurate, so no worries
> there.  In practice, the last time I checked, which was over a year
> ago, Firefox had serious problems with the playbackTime field (I don't
> remember if it was just absent or if it had some other problem that
> made it unusable.)
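> 
> A rough sketch of that bookkeeping (scriptNode and the 4096-sample
> buffer size are placeholders for whatever you actually created):
> 
> var lastPlaybackTime = 0;
> var bufferDuration = 4096 / audioContext.sampleRate; // the node's buffer size
> 
> scriptNode.onaudioprocess = function (event) {
>     lastPlaybackTime = event.playbackTime; // when this buffer will be heard
>     // ... filter the buffer as usual ...
> };
> 
> // Start a ramp exactly where the *next* processed buffer will begin:
> function rampOnNextBuffer(gainNode, target, rampLength) {
>     var nextBufferTime = lastPlaybackTime + bufferDuration;
>     gainNode.gain.setValueAtTime(gainNode.gain.value, nextBufferTime);
>     gainNode.gain.linearRampToValueAtTime(target, nextBufferTime + rampLength);
> }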
> 
> Thanks,
> -Russell
> 
> On Fri, Apr 18, 2014 at 10:50 AM, Casper Schipper
> <casper.schipper@monotonestudio.nl> wrote:
> Dear Arnau,
> 
> this is indeed a frustrating (but probably performance-wise necessary)
> limitation of the normal web audio nodes: parameters in a
> scriptProcessorNode can only be updated once every vector, which is a
> minimum of 256 samples.
> 
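> One way to soften that limit inside the node itself (just a sketch;
> node, currentGain and targetGain are hypothetical, and the node is mono)
> is to interpolate the parameter per sample across each vector:
> 
> var currentGain = 0;
> var targetGain = 1; // set elsewhere whenever a change is requested
> 
> node.onaudioprocess = function (event) {
>     var input = event.inputBuffer.getChannelData(0);
>     var output = event.outputBuffer.getChannelData(0);
>     var step = (targetGain - currentGain) / output.length;
>     for (var i = 0; i < output.length; i++) {
>         currentGain += step; // the parameter moves every sample
>         output[i] = input[i] * currentGain;
>     }
> };
> 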
> Maybe you could solve your problem by using one of the javascript libraries
> that bypass most of the web audio api and do everything in JS itself.
> What first comes to mind is the Gibberish.js library by Charlie
> Roberts, which gives you the ability to control parameters per sample and
> easily schedule synchronized parameter changes, also with sample accuracy:
> http://www.charlie-roberts.com/gibberish/docs.html
> It should be quite easy to extend it with your own nodes.
> There are other libraries as well, like Flocking.js and Timbre.js.
> 
> Of course this comes with some performance penalties, but Gibberish at
> least tries to generate javascript code that is as efficient as possible
> for its JIT compilation style, as far as its own nodes are concerned.
> 
> Hope it helps,
> Casper
> 
> casper.schipper@monotonestudio.nl
> Mauritskade 55C (the thinking hut)
> 1092 AD  Amsterdam
> +316 52 322 590
> 
> On 18 Apr 2014, at 10:55, Arnau Julia <Arnau.Julia@ircam.fr> wrote:
> 
> Hello,
> 
> I'm trying to synchronize the buffer in a scriptProcessorNode with
> native/regular web audio nodes and I'm having some problems. Specifically,
> I want to synchronize the scriptProcessorNode with a ramp of a GainNode.
> 
> My program looks like the attached diagram. Each scriptProcessorNode is a
> filter with n coefficients and these coefficients are in a global variable.
> My problem comes when I try to update these coefficients and do a ramp in
> the gain through an audioParam at the "same time".
> 
> The start scenario is (in pseudo-code):
> 
> audioBufferSourceNode.connect(scriptProcessorNode0);
> audioBufferSourceNode.connect(scriptProcessorNode1);
> 
> scriptProcessorNode0.connect(gainNode0);
> scriptProcessorNode1.connect(gainNode1);
> 
> gainNode0.connect(audioContext.destination);
> gainNode1.connect(audioContext.destination);
> 
> gainNode1.gain.value = 0;
> globalVariableOfCoefficients0 = coefficients0;
> globalVariableOfCoefficients1 = null;
> 
> audioBufferSourceNode.start(0);
> 
> The reason for having two scriptProcessorNodes is that I want to do a smooth
> transition of the coefficients, so I do a crossfade between the 'old'
> coefficients (scriptProcessorNode0) and the 'new' coefficients
> (scriptProcessorNode1) with the ramps of gainNode0 and gainNode1. So when I
> receive the notification to update the coefficients, the global variable is
> updated and the ramps are started, as in the sketch below.
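> 
> In pseudo-code, the update step looks roughly like this (fadeTime is an
> arbitrary choice):
> 
> function updateCoefficients(newCoefficients) {
>     globalVariableOfCoefficients1 = newCoefficients; // read by scriptProcessorNode1
>     var now = audioContext.currentTime;
>     var fadeTime = 0.05; // seconds; arbitrary crossfade length
>     gainNode0.gain.setValueAtTime(1, now);
>     gainNode0.gain.linearRampToValueAtTime(0, now + fadeTime);
>     gainNode1.gain.setValueAtTime(0, now);
>     gainNode1.gain.linearRampToValueAtTime(1, now + fadeTime);
> }
> 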
> The first problem is that when I change globalVariableOfCoefficients1, I
> don't know if the value of the variable has really been updated in the
> scriptProcessorNode. It seems that the scriptProcessorNode has to wait
> until it gets a new buffer before the new value of its global variables is
> seen. There is also a second problem: if I change the value of
> globalVariableOfCoefficients1 and wait for a new buffer so that the node
> picks up the new value, how can I know when the first sample of this new
> buffer really 'is' in the gainNode?
> 
> Finally, I would like to find some documentation where the relation
> between the scriptProcessorNode and the audio thread is explained, so I
> can clearly understand the problem.
> 
> Thank you very much in advance,
> 
> Arnau Julià
> 
> 
> <diagram_webAudio.png>
> 
> -- 
> Lonce Wyse
> Dept. of Communications and New Media
> National University of Singapore
> 
> 

Received on Tuesday, 6 May 2014 08:13:59 UTC