Re: Music Synthesis

Hi Robert --

On Jun 15, 2010, at 12:05 PM, Robert O'Callahan wrote:
> On Wed, Jun 16, 2010 at 6:46 AM, Chris Grigg <chris@chrisgrigg.org> wrote:
> While JavaScript/ECMAscript is an extremely convenient execution environment in a browser, it's never previously been the go-to technology for music synthesis or event system implementations, due to its low efficiency compared with native implementations.  That's why historically standardized APIs for music and sound synthesis and processing have typically interfaced to native implementations (occasionally VM implementations ala Java), not interpreted implementations.  Consumer expectations for music synthesis tend to run to relatively high polyphony -- even mobile phones typically claim 48+ simultaneous voices (wavetable synthesis plus dynamic lowpass filter) -- and this would be difficult to achieve in any interpreted/scripting language, across a broad range of client device capabilities.
> 
> It's important not to make assumptions about JS performance. JS implementation technology is evolving rapidly. For example, we have benchmarks showing a JS FFT competitive with the same code in C.
> https://bugzilla.mozilla.org/show_bug.cgi?id=490705#c49 (and following)
> In particular characterizing JS as "interpreted" or "scripting" and therefore locked into a given performance class is not helpful.
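For concreteness, here is a minimal sketch (names and parameters are my own, not from any spec or existing engine) of the per-voice work implied above -- a linearly interpolated wavetable oscillator feeding a one-pole lowpass filter -- which a software synth would repeat for each of its 48+ voices per output buffer. It's exactly this kind of tight inner loop where the JIT-vs-interpreter distinction matters:

```javascript
// Hypothetical sketch: one wavetable voice with a one-pole lowpass filter.
// A 48-voice synth runs 48 of these per output buffer, every buffer.
function makeVoice(table, phaseInc, cutoff) {
  let phase = 0;      // fractional read index into the wavetable
  let z = 0;          // one-pole lowpass filter state
  const a = cutoff;   // filter coefficient in (0, 1]; 1 = no filtering
  return function render(out) {
    for (let i = 0; i < out.length; i++) {
      // linearly interpolated wavetable lookup
      const base = Math.floor(phase);
      const i0 = base % table.length;
      const i1 = (i0 + 1) % table.length;
      const frac = phase - base;
      const s = table[i0] + frac * (table[i1] - table[i0]);
      // one-pole lowpass: z += a * (input - z)
      z += a * (s - z);
      out[i] += z;    // mix into the shared output buffer
      phase += phaseInc;
    }
  };
}

// 48 voices at slightly different pitches, mixed into one 128-sample buffer:
const table = new Float32Array(1024);
for (let i = 0; i < table.length; i++) {
  table[i] = Math.sin((2 * Math.PI * i) / table.length);
}
const voices = [];
for (let v = 0; v < 48; v++) {
  voices.push(makeVoice(table, 1 + v * 0.1, 0.3));
}
const buf = new Float32Array(128);
for (const render of voices) render(buf);
```

At 44.1 kHz that inner loop body runs ~2.1 million times per second for 48 voices, which is why both sides of this thread care about whether a given JS engine JIT-compiles it or interprets it.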

Thanks for the reminder and information.

Further to my comment on the separate "Target platforms?" thread: on what range of target platforms did you find the performance to be equivalent?  Are any of our target platforms not capable of realtime performance with that FFT implementation?  I would worry about very skinny clients, not only for FFT but for any compute-intensive operation.

Asking another way: Are these accelerated JS implementation techniques you reference deployed for all of our target platforms?


> If we're talking about dedicated audio synthesis hardware on mobile devices, then by all means let's talk about it.

My experience with mobile music synthesis has been with pure software implementations using standardized content formats (see Beatnik).  IMHO, making a W3C API dependent on music synth HW is not a viable path, due to the proprietary nature of such devices.

	-- Chris G.


> Rob
> -- 
> "He was pierced for our transgressions, he was crushed for our iniquities; the punishment that brought us peace was upon him, and by his wounds we are healed. We all, like sheep, have gone astray, each of us has turned to his own way; and the LORD has laid on him the iniquity of us all." [Isaiah 53:5-6]

Received on Tuesday, 15 June 2010 20:23:25 UTC