- From: lonce <lonce.wyse@zwhome.org>
- Date: Fri, 27 Sep 2013 16:58:01 +0200
- To: public-audio@w3.org
Yes! The world needs phases, too! +1

- lonce

On 2013-09-27 11:22 AM, Chris Lowis wrote:
> Hi,
>
> I recently demoed and talked about the Web Audio API at an event for
> audio engineers and academics. A few of them had played with the API.
> One thing that came up a couple of times was that the AnalyserNode
> only provides an accessor for the magnitude of the frequency spectrum.
> The use cases they were suggesting went beyond visualisation of data
> (ambisonics and beat detection were mentioned), and I think this would
> be a really useful change.
>
> It is currently under-specified exactly what getFloatFrequencyData
> should do, but I think the WebKit implementation returns an array of
> floating point values corresponding to the magnitude of the calculated
> spectrum.
>
> There are a few options, perhaps:
>
> i) Allow getFloatFrequencyData() to return an array of complex values somehow.
> ii) Add methods getRealFrequencyData() and getImaginaryFrequencyData().
> iii) Add getFloatPhaseData(), which would return the phase of the
> calculated spectrum, and leave getFloatFrequencyData() returning the
> magnitude.
>
> I can create an issue against the spec for this if we agree it's a
> change worth making. I think option iii is pretty good as it won't
> break existing code, but option i is probably neater.
>
> What do you think?
>
> Cheers,
>
> Chris
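
A minimal sketch, for context, of how the AnalyserNode exposes magnitude-only data today and what Chris's option (iii) might look like in use. Note that getFloatPhaseData() is hypothetical here; it exists only as the proposal in this thread, not as a specified or implemented method.

    // Existing API: magnitudes (in dB) of the most recent FFT frame.
    const ctx = new AudioContext();
    const analyser = ctx.createAnalyser();
    analyser.fftSize = 2048;

    const magnitudes = new Float32Array(analyser.frequencyBinCount);
    analyser.getFloatFrequencyData(magnitudes);

    // Hypothetical option (iii): a parallel accessor for per-bin phase
    // (e.g. radians), leaving getFloatFrequencyData() untouched.
    const phases = new Float32Array(analyser.frequencyBinCount);
    // analyser.getFloatPhaseData(phases); // proposed in this thread, not in the spec

With magnitude and phase together, a caller could reconstruct the complex spectrum, which is what the ambisonics and beat-detection use cases mentioned above would need.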
Received on Friday, 27 September 2013 14:58:22 UTC