Re: Music notation demo with live synthesized playback

> This looks like a great application for the tools we're discussing  
> here.
> But I also have to ask... is it possible to instead use the host  
> synth for playback (when one exists)?
> For example, Windows (since 1995), Mac OS (since about then), and  
> Android OS all come with an SMF player and a GM synth... and it  
> might be nice to also allow developers to use those tools rather  
> than always having to create their own wavetable synth and file  
> player...
>
> Thoughts?
>
> Tom White
> MMA


I certainly think it would be nice to have the option of using a host
synth, but the nature of the AudioNode API is such that it need not
perform any worse than an OS-provided synth.  I would expect wavetable
synth libraries for AudioNodes to appear quite quickly, since they
will be very easy to build - we hacked a simple one together in a day
- and we would certainly consider making ours a free open-source
project, as we've done with StandingWave in Flash.
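To give a sense of why I say "very easy": the inner loop of such a
library is just table lookup with interpolation, writing into a buffer
of samples.  Here is a minimal illustrative sketch (plain JavaScript,
not the draft API itself; the names and the sine table are mine, where
a real synth would load sampled instrument cycles):

```javascript
// Hypothetical sketch of a wavetable synth's inner loop: render one
// note from a single-cycle wavetable into a Float32Array. A library
// built on AudioNodes would wrap a loop like this in a node callback.

const SAMPLE_RATE = 44100;
const TABLE_SIZE = 2048;

// Build a single-cycle sine wavetable (a real synth would load
// sampled instrument cycles instead).
const table = new Float32Array(TABLE_SIZE);
for (let i = 0; i < TABLE_SIZE; i++) {
  table[i] = Math.sin((2 * Math.PI * i) / TABLE_SIZE);
}

// Render `seconds` of audio at `freq` Hz by stepping through the
// table, linearly interpolating between adjacent table samples.
function renderNote(freq, seconds) {
  const out = new Float32Array(Math.floor(seconds * SAMPLE_RATE));
  const step = (freq * TABLE_SIZE) / SAMPLE_RATE; // table cells per output sample
  let phase = 0;
  for (let n = 0; n < out.length; n++) {
    const i0 = Math.floor(phase);
    const i1 = (i0 + 1) % TABLE_SIZE;
    const frac = phase - i0;
    out[n] = table[i0] * (1 - frac) + table[i1] * frac;
    phase = (phase + step) % TABLE_SIZE;
  }
  return out;
}

const note = renderNote(440, 0.1); // 100 ms of A440
console.log(note.length); // 4410 samples
```

Envelopes, voice allocation and sample loading add work on top, of
course, but the core really is this small.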

From Noteflight's point of view, the big disadvantage of built-in
synths is their lack of predictability.  The same GM instrument sounds
completely different on various OS platforms.  Even though it would be  
easiest for us to send a MIDI stream to a host synth, we would rather  
download our own samples and control the synthesis so that all of our  
users are guaranteed to hear exactly the same thing. It also allows us  
to control and extend the repertoire of instruments that we offer  
beyond GM.

The simplest way in which I think a host synth ought to be made
available is via support in the <audio> tag for the MIDI format,
which I think might fall outside the scope of this group.  (I haven't
been in the group long enough to say that very confidently, though.)

I don't offhand see a good way to include host synths in the API
that's been developed here, unless they were to expose their audio
output as a set of samples that could be piped back into the
AudioNode framework and fed to some downstream node - and that hasn't
been my experience with OS-level synth APIs.  Without a consistent
medium of samples passing from node to node, host synths might wind up
as a parallel set of constructs within this API that don't mesh
cleanly with everything else.  It seems like it could be worked in,
but with a fair bit of effort and disruption.  It is not high on my
list of things to do with this API, but I'm sure it has value for some
part of the audience.
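To make the "consistent medium" point concrete, here is an
illustrative sketch (again plain JavaScript, not the draft API; the
node functions are hypothetical): every stage consumes and produces
blocks of float samples, so any source - including, in principle, a
host synth - could plug into the graph simply by exposing its output
in that form.

```javascript
// Illustrative sketch of the "samples as the medium" model. Each
// "node" is a function over Float32Array blocks; composition works
// because every stage speaks the same medium.

const BLOCK_SIZE = 128;

// A source node: fills a block with white noise. This stands in for
// any sample producer - hypothetically, a host synth that rendered
// its output into blocks could sit here.
function noiseSource() {
  const block = new Float32Array(BLOCK_SIZE);
  for (let i = 0; i < BLOCK_SIZE; i++) {
    block[i] = Math.random() * 2 - 1;
  }
  return block;
}

// A processing node: scales an incoming block by a gain factor.
function gainNode(input, gain) {
  const out = new Float32Array(input.length);
  for (let i = 0; i < input.length; i++) {
    out[i] = input[i] * gain;
  }
  return out;
}

// Chaining is trivial when both stages exchange sample blocks; a host
// synth that cannot emit its output as samples has nowhere to plug in.
const processed = gainNode(noiseSource(), 0.5);
console.log(processed.length); // 128
```

That is the property most OS synth APIs I've seen don't offer: they
play to the audio device directly rather than handing you samples.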

Just my immediate thoughts and I'm curious what other folks on the  
group think.

... .  .    .       Joe

Joe Berkovitz
President
Noteflight LLC
160 Sidney St, Cambridge, MA 02139
phone: +1 978 314 6271
www.noteflight.com

Received on Tuesday, 19 October 2010 00:37:35 UTC