- From: Jussi Kalliokoski <jussi.kalliokoski@gmail.com>
- Date: Mon, 6 Feb 2012 18:28:20 +0200
- To: Joseph Berkovitz <joe@noteflight.com>
- Cc: James Ingram <j.ingram@netcologne.de>, public-audio@w3.org
- Message-ID: <CAJhzemVBb8tyR29=bzdnYSrYnfxBGA1yquDD1NbNEGfVkvHccw@mail.gmail.com>
I put together a gist, in the form of IDL, of what MIDI in the browser could look like, with respect to the MediaStreams Processing API and the getUserMedia API. Excuse my weak IDL skills; I hope it suffices to introduce my idea. It's also a bit incomplete: for example, it doesn't describe how to actually push the MIDI stream to the MediaStreamProcessor, because I haven't thought of a good way to do that yet. I also included an example usage.

https://gist.github.com/1752949

Feedback appreciated.

Jussi

On Mon, Feb 6, 2012 at 4:49 PM, Joseph Berkovitz <joe@noteflight.com> wrote:

> There is no single answer to this question. I don't disagree with the fact
> that *some* applications will not be sensitive to synth consistency, but it
> is absolutely the case that others will require it.
>
> At the same time, any prospective MIDI API need not provide a
> guaranteed-consistent synthesizer. It's fair to ask any application that
> wants this level of consistency to provide its own JS-based synth (and the
> viability of doing so, on top of the Web Audio API at least, is proven).
>
> By the way, it is not the case that a JS-based wavetable synth is a
> fundamentally flawed idea, as per Marat's previous post. It is not a good
> choice for every application, but if one is willing to live within certain
> constraints and also make use of local storage, it can definitely be done.
>
> ...Joe
>
> On Feb 6, 2012, at 4:51 AM, James Ingram wrote:
>
> Robert O'Callahan said:
>
> Other people want to be able to manipulate real-time MIDI streams and
> synthesize output from them. Where do those applications come down on
> system synthesizer vs. consistent synthesis?
>
> I don't think it's important to guarantee that exactly the same audio
> results from playing a particular MIDI stream on different systems.
> Many more application scenarios are possible if browsers allow their users
> to decide which of their installed MIDI output devices they want to use, by
> listing them in a preferences dialog. On Microsoft OSs the "Microsoft GS
> Wavetable Synth" would appear there, and would be the default synthesizer.
> But if I've got a better piano installed, then I want to be allowed to use
> it. If I want to link up with a lighting system, then I want to be able to
> do that. If I've got the Vienna Philharmonic installed, I may want to use
> that.
>
> So even if there were a common default synthesizer across all browsers,
> there's no guarantee that users would actually be using it.
>
> Marat Tanalin said:
>
> It makes sense to take into account that the GM standard itself defines
> just a set of general timbres and does not regulate their exact sound, so
> different sounds depending on the specific GM device are perfectly
> acceptable for GM. If _some_ web applications need 100% consistency across
> browsers and platforms, the authors of those applications are free to
> implement their own pure-script synths _simultaneously_ with having the
> ability to use a universal system GM synth in all other, more general use
> cases.
>
> Exactly.
>
> I don't think it's too much to ask of an operating system that it should
> supply a default software synthesizer in the way that Microsoft does.
> Perhaps Microsoft could be persuaded to make theirs open source... :-)
>
> James
>
> ...Joe
>
> *Joe Berkovitz*
> President
>
> *Noteflight LLC*
> 84 Hamilton St, Cambridge, MA 02139
> phone: +1 978 314 6271
> www.noteflight.com
Received on Monday, 6 February 2012 16:28:51 UTC