Re: Oscillator syncing

On Wed, Jun 5, 2013 at 3:53 PM, Russell McClellan <russell@motu.com> wrote:

> While possible, it's not completely trivial to implement oscillator sync
> without causing aliasing.  Also, there are many other nice analog
> oscillator effects that are not implemented in the Web Audio API (PWM
> strikes me as the most important).
>
> Perhaps a more realistic approach, rather than separately specifying every
> oscillator effect a user might want, would be to implement
> "two-dimensional" wavetables with a time axis and an "effect" axis.  For
> sync this "effect" would represent the ratio between master and slave osc,
> whereas for PWM it would be the pulse width.  It would cover most of these
> "synthesis"-type oscillators while also elegantly solving aliasing issues.
> I remember a discussion with Robert Bristow-Johnson about this quite
> recently on the list.  As Robert said then, implementations of this type
> of wavetable are both fairly straightforward and quite common.
>
> While there's plenty of beauty in the idea of doing things as closely as
> possible to the "analog" way, there's a fair bit of precedent for doing
> analog-type effects using 2D wavetables - for a classic example see the
> PPG Wave digital synth, and more recently you might look to Native
> Instruments' Massive.
>

Sounds good to me!

We're trying to go a bit slowly at the moment, now that Mozilla is finishing
up their implementation, but I'm definitely interested in this and think it
would be valuable.
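
Just to make the 2D wavetable idea concrete, here's a rough sketch of how the
"effect" axis could be faked on top of today's nodes.  Nothing below is in
the spec, and makeSyncTable() is a made-up offline helper that would compute
a band-limited {real, imag} table for a given master/slave ratio:

var context = new AudioContext();
var ratios = [1.0, 1.5, 2.0, 2.5];  // sample points along the "effect" axis

// One band-limited layer per sample point, each silent until crossfaded in.
var layers = ratios.map(function (ratio) {
  var table = makeSyncTable(ratio);  // hypothetical offline table generator
  var wave = context.createPeriodicWave(table.real, table.imag);
  var osc = context.createOscillator();
  osc.setPeriodicWave(wave);
  osc.frequency.value = 110;
  var gain = context.createGain();
  gain.gain.value = 0;
  osc.connect(gain);
  gain.connect(context.destination);
  osc.start();
  return gain;
});

// Sweep the effect axis (x in [0, ratios.length - 1]) by crossfading the
// two nearest layers.
function setEffectPosition(x) {
  var i = Math.floor(x), frac = x - i;
  layers.forEach(function (g) { g.gain.value = 0; });
  layers[i].gain.value = 1 - frac;
  if (i + 1 < layers.length) layers[i + 1].gain.value = frac;
}

A real 2D wavetable oscillator would of course interpolate inside the engine
(and along the time axis too), but the basic idea is the same.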


> While we're talking about blue-sky ideas for synthesis effects (and maybe
> this falls into the "Web Audio 2.0" category), I think by far the most
> useful single node to add would be a "SpectralScriptProcessorNode" that
> would behave the same as the script processor node but would act on
> frequency-domain data rather than time-domain data.  The back-end for this
> is already more or less in place (consider the convolver node and the
> analyzer node), and it would provide a great deal of power to advanced
> users of the API.  Potential applications would be Moog-style synth
> filters, high-quality additive synthesis, real-time pitch shifting, custom
> beat detection, and spectral morphing.  The only complication I can think
> of is that the data would have to be complex numbers.
>

In a previous life, I actually worked on a phase vocoder engine (SVP and
AudioSculpt) at IRCAM, so I know a bit about this.  Things get complicated
when some nodes work with time-domain data and others with frequency-domain
data, and you have to connect them, and so on.  So it could be tricky to
spec this well.
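
That said, just to sketch what such a node might look like, here's one
possible shape.  Everything here is hypothetical -- the factory method, the
event, and its fields don't exist in the API; complex bins are passed as
separate real/imaginary arrays to sidestep the complex-number issue you
mention:

var context = new AudioContext();
var fftSize = 2048;
var spectral = context.createSpectralScriptProcessor(fftSize);  // hypothetical

spectral.onspectrumprocess = function (e) {  // hypothetical event
  var inRe = e.inputReal, inIm = e.inputImag;      // complex bins, split into
  var outRe = e.outputReal, outIm = e.outputImag;  // real/imaginary halves
  var binHz = context.sampleRate / fftSize;
  for (var k = 0; k < inRe.length; k++) {
    // Example: a crude brick-wall low-pass at 1 kHz, applied per FFT frame.
    var keep = (k * binHz) < 1000 ? 1 : 0;
    outRe[k] = inRe[k] * keep;
    outIm[k] = inIm[k] * keep;
  }
};

Even with something like that, the graph questions remain: what window size
and hop the node runs at, and what happens when its output feeds a
time-domain node.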


>
> Thanks,
> -Russell
>
> On Jun 5, 2013, at 6:20 PM, Chris Rogers <crogers@google.com> wrote:
>
>
>
>
> On Wed, Jun 5, 2013 at 2:28 PM, Josh Nielsen <josh@joshontheweb.com> wrote:
>
>> Hey All,
>>
>> It is a common technique with synthesizers to sync oscillators in a
>> master/slave setup where the slave oscillator's waveform restarts every
>> time the master oscillator's waveform restarts.  Many more complex and
>> natural-sounding timbres are possible this way.  I don't see any mention
>> of this in the API spec.  Is this going to be added, or is there some
>> clever way to accomplish this with the existing API that I haven't
>> thought of?
>>
>
> Indeed, this is a nice effect.  There's currently no way to do it, but
> we've discussed this before, and might want to add this ability.
>
> One way we could approach it is to add a new attribute to OscillatorNode
> called .master (or something like that):
>
> osc2.master = osc1;
>
> By default, the attribute would be null and the oscillator would just run normally...
>
> Chris
>
>
>>
>>
>> --
>> Thanks,
>> Josh Nielsen
>> @joshontheweb <http://twitter.com/joshontheweb>
>> joshontheweb.com
>>
>
>
>
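
To make the .master idea quoted above a bit more concrete, usage might look
something like this (purely illustrative -- the attribute doesn't exist in
the current API):

var context = new AudioContext();
var master = context.createOscillator();  // only drives the phase resets
var slave = context.createOscillator();
slave.type = "sawtooth";
master.frequency.value = 110;
slave.frequency.value = 271;   // non-integer ratio gives the classic hard-sync timbre
slave.master = master;         // hypothetical attribute from the proposal above
slave.connect(context.destination);  // only the slave is heard
master.start();
slave.start();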

Received on Wednesday, 5 June 2013 23:11:02 UTC