RE: Music notation demo with live synthesized playback

> I certainly think it would be nice to have the option of 
> using a host synth, but I think the nature of the AudioNode
> API is such that it need not perform any worse than an
> OS-provided synth.

That may be... I wasn't suggesting there was a performance advantage,
just an "it's already done" advantage...

> I would expect that wavetable synth libraries for AudioNodes would
> be available rather quickly since they will be very easy to build
> - we hacked a simple one up in a day - and we would certainly think
> of making ours a free open source project as we've done with
> StandingWave in Flash.

Okay, I can see how that might make it unnecessary to use the host
synth... but I suspect there are still developers who would rather
use it... lots of desktop apps use host GM without complaint.
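
For what it's worth, I can see why such a library would be quick to
build -- a one-voice sample player on top of the AudioNode API comes
down to very little code. A sketch, with the caveat that the helper
names (loadSample, playNote) and the middle-C root-note convention
are mine rather than anything from a shipped library:

  const ctx = new AudioContext();

  // Fetch and decode one recorded note into an AudioBuffer.
  async function loadSample(url: string): Promise<AudioBuffer> {
    const response = await fetch(url);
    return ctx.decodeAudioData(await response.arrayBuffer());
  }

  // Repitch the sample by scaling playbackRate relative to the
  // note it was recorded at (MIDI 60 = middle C, assumed here).
  function playNote(sample: AudioBuffer, midiNote: number, when: number): void {
    const src = ctx.createBufferSource();
    src.buffer = sample;
    src.playbackRate.value = Math.pow(2, (midiNote - 60) / 12);

    // A simple decay envelope so the note doesn't end in a click.
    const gain = ctx.createGain();
    gain.gain.setValueAtTime(1, when);
    gain.gain.exponentialRampToValueAtTime(0.001, when + 1.5);

    src.connect(gain).connect(ctx.destination);
    src.start(when);
  }

Playing back a parsed score is then just a loop over its note events,
calling playNote(sample, pitch, ctx.currentTime + onset) for each one.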

> From Noteflight's point of view the big disadvantage of built-in  
> synths is their lack of predictability.  The same GM instrument
> sounds completely different on various OS platforms.  Even though
> it would be easiest for us to send a MIDI stream to a host synth,
> we would rather download our own samples and control the synthesis
> so that all of our users are guaranteed to hear exactly the same
> thing. It also allows us to control and extend the repertoire of
> instruments that we offer beyond GM.

I understand your decision completely, and would not try to talk
you out of it. 

But lots of application developers use the native GM synths on the
Mac and Windows platforms (which come from the same supplier), if
not so much yet on Android, so maybe not everyone feels that GM
performance is a problem that needs to be avoided.

Likewise, Android and Mac OS already support sample downloading
natively via DLS-format wavetables and the accompanying APIs (the
Windows synth also uses the DLS format, but unfortunately the APIs
for loading custom banks are not included in the box), so for some
developers the host APIs might be an acceptable way to avoid the
stock GM wavetables.

I'm not suggesting this approach is better than yours, only that
if it is possible to take advantage of host resources, it could
be beneficial to allow that...

> The simplest way in which I think a host synth ought to be made
> available is via support in the <audio> tag for the MIDI format,
> which I think might fall outside the scope of this group.  (I
> haven't been in the group long enough to say that very confidently
> though.)

Me neither... and I'm not up to speed on what can be done with the
<audio> tag in HTML5... but I didn't think it could load synth
samples, and I'm not sure it could play back a MusicXML score in
the manner of your demo.

> I don't offhand see a good way to include host synths in the API  
> that's been developed here, unless they were to expose their audio  
> output as a set of samples that could be piped back out into the  
> AudioNode framework to be fed to some downstream node. That hasn't  
> been my experience with OS-level synth APIs.  

I see your point...
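
Just to make concrete what that bridging would require: something
like the sketch below, using the API's script-processing node. To be
clear, the hostSynth object is entirely imaginary -- it stands in for
an OS synth that could render into a caller-supplied buffer on demand
rather than straight to the hardware, which as you say is not what
the OS-level APIs actually give us.

  // Hypothetical only: no browser exposes anything like hostSynth.
  declare const hostSynth: {
    render(left: Float32Array, right: Float32Array): void;
  };

  const context = new AudioContext();
  const bridge = context.createScriptProcessor(1024, 0, 2);

  bridge.onaudioprocess = (e: AudioProcessingEvent) => {
    // Each callback asks the imaginary synth for one block of
    // samples; from there the bridge behaves like any other node
    // and can feed downstream effects or the destination.
    hostSynth.render(e.outputBuffer.getChannelData(0),
                     e.outputBuffer.getChannelData(1));
  };

  bridge.connect(context.destination);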

> Without a consistent medium of samples passing from node to node,
> host synths might wind up as a parallel set of constructs within
> this API that might not mesh cleanly with everything else.  It 
> seems like it could be worked in, but with a fair bit of effort 
> and disruption. It is not high on my list of things to do with 
> this API but I am sure it has value for some part of the audience.
> 
> Just my immediate thoughts and I'm curious what other folks on the  
> group think.

Me too...

Thanks,

Tom White
MMA
