- From: Silvia Pfeiffer <silviapfeiffer1@gmail.com>
- Date: Sun, 9 Aug 2009 11:16:01 +1000
On Sun, Aug 9, 2009 at 3:15 AM, Chris McCormick <chris at mccormick.cx> wrote:
> On Wed, Jul 08, 2009 at 09:24:42AM -0700, Charles Pritchard wrote:
>> There are two use cases that I think are important: a codec
>> implementation (let's use Vorbis), and an accessibility implementation,
>> working with a <canvas> element.
>
> Here are a few more use-cases that many people would consider just as
> important:
>
> * Browser based music software and synthesis toys.
> * New types of 'algorithmic' music like that pioneered by Brian Eno.
> * Browser based games which want to use procedural audio instead of
>   pre-rendered sound effects.
>
> I'd like to reiterate the previously expressed sentiment that only
> implementing pre-rendered audio playback is like having a browser that
> only supports static images loaded from the server instead of
> animations and <canvas> tags.
>
> What is really needed is a DSP vector processor which runs outside of
> ECMAScript, but with a good API so that ECMAScript can talk to it
> directly. Examples of reference software, mostly open source, which do
> this type of thing follow:
>
> * Csound
> * SuperCollider
> * Pure Data
> * Nyquist
> * ChucK
> * Steinberg VSTs
>
> I am going to use the terms "signal vector", "audio buffer", and
> "array" interchangeably below.
>
> Four major types of synthesis would be useful, but they are pretty much
> isomorphic, so any one of them could be implemented as a base-line:
>
> * Wavetable (implement vector write/read/lookup operators)
> * FM & AM (implement vector + and * operators)
> * Subtractive (implement unit delay from which you can build filters)
> * Frequency domain (implement FFT and back again)
>
> Of these, I feel that wavetable synthesis should be the first type of
> synthesis to be implemented, since most of the code for manipulating
> audio buffers is already going to be in the browsers and exposing those
> buffers shouldn't be hugely difficult. Basically what this would take
> is ensuring some things about the audio tag:
>
> * Supports playback of arbitrarily small buffers.
> * Seamlessly loops those small buffers.
> * Allows read/write access to those buffers from ECMAScript.
>
> Given the above, the other types of synthesis are possible, albeit
> slowly. For example, FM & AM synthesis are possible by
> adding/multiplying vectors of sine data into a currently looping audio
> buffer. Subtractive synthesis is possible by adding delayed versions of
> the data in the buffer to itself. Frequency domain synthesis is
> possible by analysing the data in the buffer with FFT (and reverse FFT)
> and writing back new data. I see this API as working as previously
> posted by Charles Pritchard, but with the following extra possibility:
>
> <audio id='mybuffer'>
> buffer = document.getElementById("mybuffer");
> // here myfunc is a function which will change
> // the audio buffer each time the buffer loops
> buffer.loopCallback = myfunc;
> buffer.loop = true;
> buffer.play();
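To make the intent of the loopCallback proposal concrete, here is a
minimal sketch of what "myfunc" might look like for a simple sine-wave
generator. The "samples" array and "sampleRate" property are assumptions
made purely for illustration; the proposal above does not specify how the
buffer contents would be exposed, and no such API exists in browsers.

// Hypothetical wavetable synthesis via the proposed loopCallback.
// Assumes the <audio> element exposes its looping buffer as a writable
// "samples" array plus a "sampleRate" property (neither is real).
var buffer = document.getElementById("mybuffer");
var phase = 0;
var freq = 440;                      // A4, in Hz

function myfunc() {
    // Rewrite one loop's worth of samples each time the buffer wraps.
    for (var i = 0; i < buffer.samples.length; i++) {
        buffer.samples[i] = Math.sin(phase);
        phase += 2 * Math.PI * freq / buffer.sampleRate;
    }
}

buffer.loopCallback = myfunc;
buffer.loop = true;
buffer.play();

The point, as in the proposal above, is that the synthesis code only has
to run once per buffer loop rather than once per output sample.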
> Of course, ECMAScript is probably going to be too slow in the short
> term, so moving forward it would be great if there was a library/API
> which could do the following vector operations in the background at a
> speed faster than doing them directly, element by element, inside
> ECMAScript (a bit like Python's Numeric module). All inputs and outputs
> are signal vectors/audio tag buffers:
>
> * + - add two signal vectors (2 inputs, 1 output)
> * * - multiply two signal vectors (2 inputs, 1 output)
> * z - delay a signal vector by a customisable number of samples
>   (2 inputs, 1 output)
> * read - do a table lookup (1 input, 1 output)
> * write - do a table write (2 inputs, 1 output)
> * copy - memcpy a signal vector (1 input, 1 output)
> * fft - do a fast Fourier transform (1 input, 2 outputs)
> * rfft - do a reverse fast Fourier transform (2 inputs, 1 output)
>
> It would be so great if it was possible to unify the above into an API
> that looked and worked something like this:
>
> <audio id='mybuffer'>
>
> outbuffer = document.getElementById("mybuffer");
>
> b = new AudioBuffer(64);
> for (x = 0; x < 64; x++)
>     b[x] = Math.sin(x / 64 * Math.PI);
>
> // inside the loopCallback do a vector multiplication of the data in
> // our buffer with a sine wave we created earlier.
> outbuffer.multiply(b);

Why don't you just implement an example in javascript to show off what
you're talking about and make a use case for having it implemented
inside the browsers?

Cheers,
Silvia.
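As a rough illustration of the kind of example Silvia asks for, here is a
minimal pure-ECMAScript sketch of element-by-element versions of a few of
the vector operations Chris lists (add, multiply, delay), plus a variant
of the outbuffer.multiply(b) example. The function names are illustrative
only, and without the buffer access proposed earlier there is no way to
actually play the result, so the sketch just leaves it in an ordinary
array.

// Element-by-element vector operations that a native DSP API would
// accelerate. Plain arrays of numbers stand in for audio buffers.
function add(a, b) {                  // + : 2 inputs, 1 output
    var out = new Array(a.length);
    for (var i = 0; i < a.length; i++) out[i] = a[i] + b[i];
    return out;
}

function multiply(a, b) {             // * : 2 inputs, 1 output
    var out = new Array(a.length);
    for (var i = 0; i < a.length; i++) out[i] = a[i] * b[i];
    return out;
}

function delay(a, n) {                // z : shift by n samples, zero-fill
    var out = new Array(a.length);
    for (var i = 0; i < a.length; i++)
        out[i] = i < n ? 0 : a[i - n];
    return out;
}

// Amplitude-modulate an 8-cycle sine wave with a 1-cycle sine wave,
// similar to the outbuffer.multiply(b) example above.
var carrier = [], modulator = [];
for (var x = 0; x < 64; x++) {
    carrier[x] = Math.sin(x / 64 * 2 * Math.PI * 8);
    modulator[x] = Math.sin(x / 64 * 2 * Math.PI);
}
var outdata = multiply(carrier, modulator);
// With the proposed API, outdata would be written back into the looping
// <audio> buffer; here it remains an ordinary array of samples.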
Received on Saturday, 8 August 2009 18:16:01 UTC