Re: Adding MIDI APIs to Audio WG Charter (was: MIDI enumeration (was: Re: getUserMedia use cases))

I think we need to separate the requirements here.

Joseph said that for applications like his, consistency of the synthesizer
is really important, so using different system synthesizers is not
acceptable. So for that, either all browsers build in the same synthesizer
or we do our best to make JS synthesizers work well. (I hope the latter.)
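For illustration, the core of a JS synthesizer is just math over MIDI note
numbers, which is the same everywhere; this is a minimal sketch (names are
mine, not from any proposed API), with the Web Audio playback part omitted
since it needs an AudioContext:

```javascript
// Hypothetical sketch: equal-temperament tuning a JS synthesizer would use.
// A4 (MIDI note 69) is 440 Hz; each semitone is a factor of 2^(1/12).
function midiNoteToFrequency(note) {
  return 440 * Math.pow(2, (note - 69) / 12);
}

// In a browser, an OscillatorNode could then be set to this frequency.
console.log(midiNoteToFrequency(69)); // 440
```

The point is that this part is deterministic, so consistency across browsers
falls out for free; the hard part is making the synthesis fast enough.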

Apparently consistency isn't as important to you, and you just want to play
MIDI files somehow. For that, adding MIDI as a supported media type and
using the system synthesizer (when available) makes sense.

Other people want to be able to manipulate real-time MIDI streams and
synthesize output from them. Where do those applications come down on
system synthesizer vs consistent synthesis?
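For concreteness, here is the kind of byte-level work such applications do on
an incoming stream; this is an illustrative sketch only (the function name is
mine), decoding a raw MIDI channel message per the MIDI 1.0 spec:

```javascript
// Illustrative: decode a raw MIDI message ([status, data1, data2]).
// High nibble of the status byte is the message type, low nibble the channel.
function parseMidiMessage(bytes) {
  const status = bytes[0];
  const kind = status & 0xf0;
  const channel = status & 0x0f;
  // Note-on with velocity 0 is conventionally treated as note-off.
  if (kind === 0x90 && bytes[2] > 0) {
    return { type: "note-on", channel, note: bytes[1], velocity: bytes[2] };
  }
  if (kind === 0x80 || (kind === 0x90 && bytes[2] === 0)) {
    return { type: "note-off", channel, note: bytes[1], velocity: bytes[2] };
  }
  return { type: "other", channel };
}
```

An app that does this itself and then synthesizes in JS gets consistency; one
that forwards the stream to a system synthesizer does not.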

Rob
-- 
"If we claim to be without sin, we deceive ourselves and the truth is not
in us. If we confess our sins, he is faithful and just and will forgive us
our sins and purify us from all unrighteousness. If we claim we have not
sinned, we make him out to be a liar and his word is not in us." [1 John
1:8-10]

Received on Friday, 3 February 2012 23:26:54 UTC