
Re: Adding MIDI APIs to Audio WG Charter (was: MIDI enumeration (was: Re: getUserMedia use cases))

From: Robert O'Callahan <robert@ocallahan.org>
Date: Sat, 4 Feb 2012 12:26:17 +1300
Message-ID: <CAOp6jLatG=B97t-umHtRq+cF4GHjdZ254K5dgfLGEPqg5iYduQ@mail.gmail.com>
To: "Marat Tanalin | tanalin.com" <mtanalin@yandex.ru>
Cc: "Tom White (MMA)" <lists@midi.org>, Chris Wilson <cwilso@google.com>, Doug Schepers <schepers@w3.org>, Joseph Berkovitz <joe@noteflight.com>, Robin Berjon <robin@berjon.com>, public-audio@w3.org, Dom Hazael-Massieux <dom@w3.org>, jussi.kalliokoski@gmail.com

I think we need to separate the requirements here.

Joseph said that for applications like his, consistency of the synthesizer
is really important, so using different system synthesizers is not
acceptable. So for that, either browsers all build in the same synthesizer
or we do our best to make JS synthesizers work well. (I hope the latter.)
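One reason a JS synthesizer can deliver consistency is that the pitch math is fully specified, so every browser computes the same result. A minimal sketch (the function name is mine, not from the thread) of the standard equal-temperament mapping from MIDI note number to frequency, with A4 (note 69) tuned to 440 Hz:

```javascript
// Convert a MIDI note number (0-127) to its frequency in Hz,
// using equal temperament with A4 (MIDI note 69) at 440 Hz.
function midiNoteToFrequency(note) {
  return 440 * Math.pow(2, (note - 69) / 12);
}

// A JS synthesizer would feed this value into an oscillator's
// frequency; since the formula is deterministic, the pitch is
// identical everywhere, unlike a system synthesizer's voicing.
```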

Apparently consistency isn't as important to you, and you just want to play
MIDI files somehow. For that, adding MIDI as a supported media type and
using the system synthesizer (when available) makes sense.
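Since the system synthesizer is only sometimes available, a page would want to feature-detect before relying on it. A sketch using the existing HTMLMediaElement.canPlayType() API (which returns "", "maybe", or "probably" for a MIME type); "audio/midi" is the registered type, but whether any browser reports support for it is exactly the open question here:

```javascript
// Feature-detect MIDI playback support on a media element via
// canPlayType(), which returns "" when the type is unsupported.
function midiPlaybackSupported(audioElement) {
  return typeof audioElement.canPlayType === "function" &&
         audioElement.canPlayType("audio/midi") !== "";
}

// In a page, midiPlaybackSupported(document.createElement("audio"))
// would decide between setting src to a .mid file and falling back
// to a JS synthesizer.
```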

Other people want to be able to manipulate real-time MIDI streams and
synthesize output from them. Where do those applications come down on
system synthesizer vs consistent synthesis?
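For the real-time case, the raw material is channel voice messages: a status byte (upper nibble = message type, lower nibble = channel) followed by two data bytes. A sketch of decoding one such message (the function shape is mine; the byte layout is from the MIDI 1.0 spec), including the convention that note-on with velocity 0 means note-off:

```javascript
// Decode a 3-byte MIDI channel voice message such as note-on (0x9n)
// or note-off (0x8n), where n is the channel number.
function parseMidiMessage(bytes) {
  const [status, data1, data2] = bytes;
  const type = status & 0xf0;
  return {
    channel: status & 0x0f,
    type: type === 0x90 && data2 > 0 ? "note-on"
        // Note-on with velocity 0 is defined to mean note-off.
        : (type === 0x80 || type === 0x90) ? "note-off"
        : "other",
    note: data1,
    velocity: data2,
  };
}
```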

"If we claim to be without sin, we deceive ourselves and the truth is not
in us. If we confess our sins, he is faithful and just and will forgive us
our sins and purify us from all unrighteousness. If we claim we have not
sinned, we make him out to be a liar and his word is not in us." [1 John
1:8-10]
Received on Friday, 3 February 2012 23:26:54 UTC
