
Re: Oscillator syncing

From: Russell McClellan <russell@motu.com>
Date: Wed, 5 Jun 2013 18:53:51 -0400
Cc: Josh Nielsen <josh@joshontheweb.com>, "public-audio@w3.org" <public-audio@w3.org>
Message-Id: <F294DA2A-B750-4400-8745-5BF8DE931A1E@motu.com>
To: Chris Rogers <crogers@google.com>
While possible, it's not completely trivial to implement oscillator sync without causing aliasing.  Also, there are many other nice analog oscillator effects that are not implemented in the Web Audio API (pulse-width modulation, or PWM, strikes me as the most important).  
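To make the aliasing problem concrete, here is a minimal sketch (not part of any API; all names are invented for illustration) of naive hard sync: the slave oscillator's phase is reset every time the master's phase wraps, and that reset produces a waveform discontinuity, which is exactly what aliases.

```javascript
// Naive hard sync, rendered offline. The slave phase resets whenever the
// master phase wraps; the resulting discontinuity is the source of aliasing.
function renderHardSync(masterFreq, slaveFreq, sampleRate, numSamples) {
  const out = new Float32Array(numSamples);
  let masterPhase = 0;
  let slavePhase = 0;
  for (let i = 0; i < numSamples; i++) {
    out[i] = Math.sin(2 * Math.PI * slavePhase);
    masterPhase += masterFreq / sampleRate;
    slavePhase += slaveFreq / sampleRate;
    if (masterPhase >= 1) {
      masterPhase -= 1;
      slavePhase = 0; // hard reset: a discontinuity unless the slave happened to be at zero
    }
    if (slavePhase >= 1) slavePhase -= 1;
  }
  return out;
}
```

A non-aliasing implementation would have to smooth or band-limit that reset (e.g. with minBLEP-style correction or the wavetable approach below), which is why "just reset the phase" is not a complete answer.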

Perhaps a more realistic approach, rather than separately specifying every oscillator effect a user might want, would be to implement "two dimensional" wavetables with a time axis and an "effect" axis.  For sync, this "effect" would be the ratio between the master and slave oscillators, while for PWM it would be the pulse width.  This would cover most of these "synthesis"-type oscillators while also elegantly solving the aliasing issues.  I remember a discussion with Robert Bristow-Johnson about this quite recently on the list.  As Robert said then, implementations of this type of wavetable are both fairly straightforward and quite common.
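A minimal sketch of the two-dimensional wavetable idea, assuming invented names (nothing like this is in the spec): a stack of single-cycle tables indexed by the "effect" parameter, read with bilinear interpolation across the phase and effect axes.

```javascript
// Build a stack of single-cycle tables; generator(effect, phase) -> sample,
// with both arguments in [0, 1). Each table is pre-rendered at one effect value.
function makeWavetable2D(numTables, tableSize, generator) {
  const tables = [];
  for (let t = 0; t < numTables; t++) {
    const table = new Float32Array(tableSize);
    const effect = t / (numTables - 1);
    for (let i = 0; i < tableSize; i++) {
      table[i] = generator(effect, i / tableSize);
    }
    tables.push(table);
  }
  return tables;
}

// Bilinear lookup: interpolate along the phase axis within the two nearest
// tables, then crossfade between those tables along the effect axis.
function lookup2D(tables, phase, effect) {
  const tPos = effect * (tables.length - 1);
  const t0 = Math.floor(tPos);
  const t1 = Math.min(t0 + 1, tables.length - 1);
  const tFrac = tPos - t0;

  const read = (table) => {
    const pPos = phase * table.length;
    const i0 = Math.floor(pPos) % table.length;
    const i1 = (i0 + 1) % table.length; // wrap around the cycle
    const pFrac = pPos - Math.floor(pPos);
    return table[i0] * (1 - pFrac) + table[i1] * pFrac;
  };
  return read(tables[t0]) * (1 - tFrac) + read(tables[t1]) * tFrac;
}
```

Because each table along the effect axis can be pre-rendered band-limited, sweeping the effect parameter (sync ratio, pulse width) never introduces the discontinuities that cause aliasing in the naive approach.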

While there's plenty of beauty in the idea of doing things as closely as possible to the "analog" way, there's a fair bit of precedent for doing analog-type effects with 2D wavetables - for a classic example, see the PPG Wave digital synth; more recently, look to Native Instruments' Massive.

While we're talking about blue-sky ideas for synthesis effects (and maybe this falls into the "Web Audio 2.0" category), I think by far the most useful single node to add would be a "SpectralScriptProcessorNode" that would behave the same as the script processor node but would act on frequency-domain data rather than time-domain data.  The back end for this is already more or less in place (consider the convolver node and the analyser node), and it would give advanced users of the API a great deal of power.  Potential applications include Moog-style synth filters, high-quality additive synthesis, real-time pitch shifting, custom beat detection, and spectral morphing.  The only complication I can think of is that the data would have to be complex-valued.
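To illustrate what processing complex frequency-domain data would mean in practice, here is a self-contained sketch (no such node exists; the function names are invented, and a real implementation would use an FFT, not this O(n^2) DFT): a "spectral callback" that implements a brickwall low-pass by zeroing bins above a cutoff.

```javascript
// Naive DFT: returns the complex spectrum as parallel real/imaginary arrays.
function dft(signal) {
  const n = signal.length;
  const re = new Float64Array(n), im = new Float64Array(n);
  for (let k = 0; k < n; k++) {
    for (let t = 0; t < n; t++) {
      const a = (-2 * Math.PI * k * t) / n;
      re[k] += signal[t] * Math.cos(a);
      im[k] += signal[t] * Math.sin(a);
    }
  }
  return { re, im };
}

// Inverse DFT; for a conjugate-symmetric spectrum the result is real.
function idft(spec) {
  const n = spec.re.length;
  const out = new Float64Array(n);
  for (let t = 0; t < n; t++) {
    for (let k = 0; k < n; k++) {
      const a = (2 * Math.PI * k * t) / n;
      out[t] += (spec.re[k] * Math.cos(a) - spec.im[k] * Math.sin(a)) / n;
    }
  }
  return out;
}

// The "spectral callback": a brickwall low-pass that zeroes every bin from
// cutoffBin up through its mirror image, preserving conjugate symmetry.
function processSpectral(signal, cutoffBin) {
  const spec = dft(signal);
  const n = signal.length;
  for (let k = cutoffBin; k <= n - cutoffBin; k++) {
    spec.re[k] = 0;
    spec.im[k] = 0;
  }
  return idft(spec);
}
```

The conjugate-symmetry bookkeeping in `processSpectral` is exactly the complex-number complication mentioned above: a user callback that breaks that symmetry would produce a complex time-domain signal.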

Thanks,
-Russell

On Jun 5, 2013, at 6:20 PM, Chris Rogers <crogers@google.com> wrote:

> 
> 
> 
> On Wed, Jun 5, 2013 at 2:28 PM, Josh Nielsen <josh@joshontheweb.com> wrote:
> Hey All,
> 
> It is a common technique with synthesizers to sync oscillators in a master/slave setup, where the slave oscillator's waveform restarts every time the master oscillator's waveform restarts.  A lot of more complex and natural-sounding timbres are possible this way.  I don't see any mention of this in the API spec.  Is this going to be added, or is there some clever way to accomplish it with the existing API that I haven't thought of?
> 
> Indeed, this is a nice effect.  There's currently no way to do it, but we've discussed this before, and might want to add this ability.
> 
> One way we could approach it is to add a new attribute to OscillatorNode called .master (or something like that):
> 
> osc2.master = osc1;
> 
> By default, the attribute would be null and thus would just run normally...
> 
> Chris
>  
> 
> 
> -- 
> Thanks,
> Josh Nielsen
> @joshontheweb
> joshontheweb.com
> 


Received on Wednesday, 5 June 2013 22:54:20 UTC
