- From: Chris Rogers <crogers@google.com>
- Date: Thu, 9 May 2013 14:41:47 -0700
- To: rbj@audioimagination.com
- Cc: "public-audio@w3.org" <public-audio@w3.org>
- Message-ID: <CA+EzO0nD_QKMbkvhMRsXKS_GonsV-oDLbRmhZ3zK6hcCWU7kew@mail.gmail.com>
On Thu, May 9, 2013 at 1:56 PM, <rbj@audioimagination.com> wrote:

> Chris, i am studying the WaveTable and OscillatorNode code. i'm sorta
> assuming you're in the file's heritage, i dunno.

Yeah, I wrote it.

> don't s'pose there's a pitch or period estimator in this, is there?
>
> are you guys doing any analysis tools that would take a recorded note and
> extract either additive synth data (that can be fed to
> createBandLimitedTables()) or the extracted wavetable directly from the
> pitch-tracked data?
>
> you might want the means to a more general wavetable definition than SINE,
> SQUARE, SAWTOOTH, and TRIANGLE. it might be hard to find some of it, it's
> very old (pre-1996), the computer that it lives on hasn't been turned on
> for a while, but i did this before, including what you did defining
> wavetables. and i might be able to find some MATLAB code of the same
> vintage.

The idea is that you can do the analysis of a recorded note directly in
JavaScript, then create a WaveTable object (which internally in WebKit/Blink
would call into createBandLimitedTables(), but that's an implementation
detail). So, in other words, it already is more general than SINE, SQUARE,
SAWTOOTH, and TRIANGLE, since you can create your own with the
createWaveTable() method. The common waveforms are just there for
convenience.

> btw, i am curious about the cos(0.5 * omega) factors in SAWTOOTH and
> TRIANGLE (and not in SQUARE). what is that factor for?

Not sure off the top of my head, I think it just turned out to be the way
the Fourier series worked out.

> and i was looking for where CentsPerRange might be used. that would be
> interesting code, i presume, building a "keymap" of ranges or whatever you
> call the set of wavetables that correspond to different places on the
> keyboard. in some semantics, a "wavetable" is an array of what we might be
> calling a wavetable here (which is an array of a single period of samples).

Yes, sorry that the name WaveTable may have been a bit poorly chosen, and
perhaps PeriodicWave might be better. I think there's an open bug about the
naming here... I realize that it could be useful to have an array of these
things. I think I remember reading a paper you wrote about this topic and
the analysis techniques you used!

> so you can have that array of wavetables with 2 or more dimensions ("slow
> time", pitch, key velocity, key pressure, modwheel, pedal, joystick,
> slider, i dunno, whatever controller). and you can interpolate in any or
> all of these dimensions, but it gets costly. you double the number of
> linear interpolations every time you add a dimension to interpolate in.

Yes, cool stuff.

> r b-j
>
> ---------------------------- Original Message ----------------------------
> Subject: Re: [Bug 21980] New: WaveTable is highly underspecified
> From: "Chris Rogers" <crogers@google.com>
> Date: Thu, May 9, 2013 2:57 pm
> To: "Ehsan Akhgari" <ehsan.akhgari@gmail.com>
> Cc: rbj@audioimagination.com
>     "public-audio@w3.org" <public-audio@w3.org>
> --------------------------------------------------------------------------
>
> On Thu, May 9, 2013 at 7:26 AM, Ehsan Akhgari <ehsan.akhgari@gmail.com> wrote:
>
>> On Wed, May 8, 2013 at 11:54 PM, Chris Rogers <crogers@google.com> wrote:
>>
>>> On Wed, May 8, 2013 at 8:46 PM, <rbj@audioimagination.com> wrote:
>>>
>>>> it seems to me that the createWaveTable() method is, essentially, an
>>>> inverse DFT. is it anything else?
>>>
>>> That's the basic idea, but then this time-domain version has to be
>>> sampled at many different rates, so we want to avoid aliasing. In WebKit
>>> (Blink is the same) we use a multi-table approach (similar to mipmaps
>>> for graphics) where each table uses an inverse DFT, culling out an
>>> appropriate number of aliasing harmonics. Then when we render the
>>> waveform, we select which two adjacent tables to use, interpolate
>>> between those two, and then use linear interpolation...
>>>
>>> https://svn.webkit.org/repository/webkit/trunk/Source/WebCore/Modules/webaudio/WaveTable.cpp
>>> https://svn.webkit.org/repository/webkit/trunk/Source/WebCore/Modules/webaudio/OscillatorNode.cpp
>>>
>>> We have tests where we load up a WaveTable and use it with an
>>> OscillatorNode, sweeping the frequency from very low to high (something
>>> like 10 Hz -> 20 kHz), checking that the aliasing isn't too bad...
>>
>> That is my point. :-) It should be possible to implement WaveTable (and
>> OscillatorNode) without linking to the WebKit implementation!
>
> But I think the spec is very detailed and quite clear on what the
> WaveTable and OscillatorNode should do mathematically. It doesn't dictate
> a specific technique for achieving the effect. I only mentioned the WebKit
> code here as one possible practical approach.
>
>> --
>> Ehsan
>> <http://ehsanakhgari.org/>
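For readers who want to experiment with the multi-table idea described above, here is a rough JavaScript sketch: one band-limited single-cycle table per pitch range, each produced by an inverse DFT that keeps only the harmonics that stay below Nyquist for that range. This is only an illustration under stated assumptions, not the WebKit code; the function name `buildRangeTables`, the octave-per-range layout, and the parameters are invented for the example. The `real`/`imag` arrays are the same kind of Fourier coefficients one would hand to createWaveTable().

```js
// Sketch: build one band-limited, single-cycle table per pitch range.
// real[k], imag[k] are Fourier coefficients for harmonic k (k = 0 is DC).
function buildRangeTables(real, imag, tableLength, sampleRate, numRanges) {
  const nyquist = sampleRate / 2;
  const tables = [];
  for (let r = 0; r < numRanges; r++) {
    // Hypothetical layout: range r serves fundamentals up to nyquist / 2^r,
    // so it can keep at most 2^r harmonics without aliasing.
    const topFundamental = nyquist / Math.pow(2, r);
    const maxHarmonic = Math.min(real.length - 1,
                                 Math.floor(nyquist / topFundamental));
    const table = new Float32Array(tableLength);
    // Inverse DFT with the aliasing harmonics culled for this range.
    for (let n = 0; n < tableLength; n++) {
      const phase = (2 * Math.PI * n) / tableLength;
      let sample = 0;
      for (let k = 1; k <= maxHarmonic; k++) {
        sample += real[k] * Math.cos(k * phase) + imag[k] * Math.sin(k * phase);
      }
      table[n] = sample;
    }
    tables.push(table);
  }
  // tables[0] covers the highest pitch range, tables[numRanges - 1] the lowest.
  return tables;
}

// Example (hypothetical numbers): 8 ranges of 2048-sample tables
// for an analysis result with up to ~1000 harmonics:
//   const tables = buildRangeTables(real, imag, 2048, 44100, 8);
```

At render time, as described in the quoted message, one would pick the two tables whose ranges bracket the current fundamental, read each with linear interpolation, and crossfade between them so the harmonic content changes smoothly as the pitch sweeps.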
Received on Thursday, 9 May 2013 21:42:14 UTC