Re: Web Audio API questions and comments

On 2012-06-19 19:01:44, Joe Turner <joe@oampo.co.uk> wrote:

> Hey Chris,
> Thanks for the reply,
>
> On Tue, Jun 19, 2012 at 5:34 PM, Chris Wilson <cwilso@google.com> wrote:
>>
>> On Tue, Jun 19, 2012 at 7:39 AM, Joe Turner <joe@oampo.co.uk> wrote:
>>>
>>>
>>> - RealTimeAnalyserNode
>>>
>>> This seems strange to me - the functionality could be really useful,
>>> but it seems focused very narrowly on creating visualisations.  I
>>> think a nicer solution would be to have separate FFT and IFFT nodes so
>>> frequency domain effects could be integrated into the processing
>>> chain, and then a separate node which allows access to the FFT or
>>> waveform data depending on where in the graph it is inserted.  So for
>>> visualisations you would have an AudioNode connected to an FFTNode,
>>> connected to a BufferYoinkerNode.
>>
>>
>> This is actually about what realtimeAnalyser is.  What scenario are you
>> trying to do, exactly?
>>
>
> Yes, I was more interested in what this would allow on the audio side
> apart from data visualisation.  For example:
>
> AudioNode -> FFTNode -> InterestingFrequencyDomainEffectNode ->
> IFFTNode -> AudioDestinationNode

I still believe that the best solution here would be to provide a
lightning-fast FFT function/object (independent of the Web Audio API)
that can be called from JavaScript. That way a wide range of interesting
FFT-based effects could be implemented in a JavaScriptAudioNode. It
would also make the FFT part of the RealTimeAnalyserNode unnecessary
(just provide the time-domain signal, and the programmer can call the
FFT function from JS using any windowing function they wish).

See https://www.w3.org/Bugs/Public/show_bug.cgi?id=17361 for some
suggestions for the RealTimeAnalyserNode.


Regards,

   Marcus


-- 
Marcus Geelnard
Core Graphics Developer
Opera Software ASA

Received on Wednesday, 20 June 2012 06:23:15 UTC