Re: Adding MIDI APIs to Audio WG Charter (was: MIDI enumeration (was: Re: getUserMedia use cases))

04.02.2012, 03:26, "Robert O'Callahan" <robert@ocallahan.org>:
> I think we need to separate the requirements here.
>
> Joseph said that for applications like his, consistency of the synthesizer is really important, so using different system synthesizers is not acceptable. So for that, either browsers all build in the same synthesizer or we do our best to make JS synthesizers work well. (I hope the latter.)
>
> Apparently consistency isn't as important to you, and you just want to play MIDI files somehow. For that, adding MIDI as a supported media type and using the system synthesizer (when available) makes sense.
>
> Other people want to be able to manipulate real-time MIDI streams and synthesize output from them. Where do those applications come down on system synthesizer vs consistent synthesis?
>
> Rob
> --
> "If we claim to be without sin, we deceive ourselves and the truth is not in us. If we confess our sins, he is faithful and just and will forgive us our sins and purify us from all unrighteousness. If we claim we have not sinned, we make him out to be a liar and his word is not in us." [1 John 1:8-10]

It makes sense to take into account that the GM standard itself defines just a set of general timbres and does not regulate their exact sound, so different sound depending on the specific GM device is perfectly acceptable for GM. If _some_ web applications need 100% consistency across browsers and platforms, the authors of those applications are free to implement their own pure-script synths while _simultaneously_ retaining the ability to use a universal system GM synth in all other, more general use cases.
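As a rough illustration of the pure-script route (this sketch is mine, not from the thread; `playNote` and its parameters are hypothetical names), the pitch side is just standard equal-temperament math, and a minimal voice on top of the Web Audio API might look like:

```javascript
// Standard equal-temperament conversion: MIDI note 69 = A4 = 440 Hz,
// each semitone is a factor of 2^(1/12).
function midiNoteToFreq(note) {
  return 440 * Math.pow(2, (note - 69) / 12);
}

// Illustrative one-oscillator voice (browser-only; assumes a running
// AudioContext). A real pure-script GM synth would of course need
// per-program timbres, envelopes, polyphony, etc.
function playNote(ctx, note, durationSec) {
  const osc = ctx.createOscillator();
  const gain = ctx.createGain();
  osc.frequency.value = midiNoteToFreq(note);
  gain.gain.value = 0.2; // keep the level modest
  osc.connect(gain);
  gain.connect(ctx.destination);
  osc.start();
  osc.stop(ctx.currentTime + durationSec);
}
```

The point being: the consistent-sounding part that such applications actually need is entirely expressible in script, independent of whatever GM device the system happens to provide.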

Received on Saturday, 4 February 2012 00:11:15 UTC