- From: Karl Tomlinson <karlt+public-audio@karlt.net>
- Date: Tue, 26 Nov 2013 22:07:44 +1300
- To: public-audio@w3.org
Robert O'Callahan writes:

> What Karl needs is a way to sync parameter changes with the point in
> time when the audio engine actually starts mixing an
> AudioBufferSourceNode into its output.

The ability to use AudioNode output to drive AudioParams comes in handy here, because an AudioBufferSourceNode feeding the AudioParam can be started immediately, providing minimum-latency effects. This works well for short-duration effects. For changes to the effect after the initial phase, the precise offset is perhaps usually less important.
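A minimal sketch of the technique, as I understand it (the envelope shape, duration, and the choice of a GainNode's gain param are my assumptions, not from the thread; this is browser-only Web Audio code):

```javascript
// Sketch: drive a GainNode's gain AudioParam from an
// AudioBufferSourceNode, started immediately for minimum latency.
const ctx = new AudioContext();

// Hypothetical helper: build a short decaying-ramp envelope buffer.
function makeEnvelopeBuffer(ctx, seconds) {
  const buf = ctx.createBuffer(1, seconds * ctx.sampleRate, ctx.sampleRate);
  const data = buf.getChannelData(0);
  for (let i = 0; i < data.length; i++) {
    data[i] = 1 - i / data.length; // 1 -> 0 linear decay
  }
  return buf;
}

const gain = ctx.createGain();
// A connected node's output is summed with the param's intrinsic value,
// so set the intrinsic value to 0 and let the envelope supply the gain.
gain.gain.value = 0;

const envSource = ctx.createBufferSource();
envSource.buffer = makeEnvelopeBuffer(ctx, 0.1);
envSource.connect(gain.gain); // AudioNode output into an AudioParam
envSource.start();            // start now: no scheduled offset to sync
```

Because start() is called with no time argument, the envelope begins as soon as the engine can mix it, which is the minimum-latency behaviour described above.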
Received on Tuesday, 26 November 2013 09:08:27 UTC