Oscillator.setWaveTable() vs Oscillator.setPeriodicWave()

Hi,

I'm a 'user' of the Web Audio API, though not a heavy one at the moment; I'm
more of a tinkerer with a background in audio technology.

I'd previously managed to re-create an experiment from many years ago in
which I built my own wave-table-based synth, but when I revisited it
recently I noticed that the setWaveTable function had been deprecated in
favour of setPeriodicWave.

I'm emailing to ask for a bit of background on why this was done.

To me, as a semi-lay-person where audio is concerned, a sound is best
described as a set of samples. For example, I might want to build a sampler
that triggers various snippets (which can be done with other nodes, I know),
or produce a wavetable that my browser can't generate itself because it came
out of lots of effects processing, etc.

Now, I know it's possible to describe basic/fundamental waveforms as Fourier
coefficients (harmonic amplitudes and phases), but I always struggled with
that level of DSP: it was far easier for me to bang out a for-loop that
generates a load of samples than to try to describe the sound
mathematically.  I know I can still do this if I want to, passing my samples
through an FFT to get the coefficients, but that seems a bit backwards to
me, considering they are only going to be passed through an inverse FFT to
become samples again.
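
To make this concrete, here is roughly the sort of thing I end up writing
now (just a rough sketch in TypeScript; makeSawSamples and
samplesToPeriodicWave are my own throwaway helpers, not part of the Web
Audio API):

// Bang out one period of samples with a plain for-loop (a crude sawtooth here).
function makeSawSamples(length: number): Float32Array {
  const samples = new Float32Array(length);
  for (let n = 0; n < length; n++) {
    samples[n] = 2 * (n / length) - 1; // ramp from -1 up to just under +1
  }
  return samples;
}

// Naive DFT of that single period into the real (cosine) and imag (sine)
// harmonic arrays that createPeriodicWave() expects; index 0 is DC, which
// the API ignores, so it is left at zero.
function samplesToPeriodicWave(ctx: AudioContext, samples: Float32Array): PeriodicWave {
  const N = samples.length;
  const harmonics = Math.floor(N / 2);
  const real = new Float32Array(harmonics);
  const imag = new Float32Array(harmonics);
  for (let k = 1; k < harmonics; k++) {
    for (let n = 0; n < N; n++) {
      const phase = (2 * Math.PI * k * n) / N;
      real[k] += (2 / N) * samples[n] * Math.cos(phase);
      imag[k] += (2 / N) * samples[n] * Math.sin(phase);
    }
  }
  return ctx.createPeriodicWave(real, imag); // normalised by default
}

// Feed the resulting wave to an oscillator where setWaveTable() used to go.
const ctx = new AudioContext();
const osc = ctx.createOscillator();
osc.setPeriodicWave(samplesToPeriodicWave(ctx, makeSawSamples(2048)));
osc.frequency.value = 220;
osc.connect(ctx.destination);
osc.start();

That works, but the DFT loop in the middle feels like busywork when all I
really wanted was to hand the browser my samples.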

I suspect I speak for many people who visualise sound better as a series of
samples than as a mathematical function.

I can only assume that one of the reasons may be to prevent 'incorrectly
described' waveforms, i.e. glitchy stuff, from getting into the audio chain,
but some of us like that sort of thing...

I hope you can enlighten me, as I'd love to see that function reinstated.

Thanks

Glen Pike
