W3C home > Mailing lists > Public > public-audio@w3.org > January to March 2012

Re: Adding MIDI APIs to Audio WG Charter (was: MIDI enumeration (was: Re: getUserMedia use cases))

From: Marat Tanalin | tanalin.com <mtanalin@yandex.ru>
Date: Thu, 02 Feb 2012 04:01:52 +0400
To: Doug Schepers <schepers@w3.org>
Cc: Chris Wilson <cwilso@google.com>, Joseph Berkovitz <joe@noteflight.com>, "Tom White (MMA)" <lists@midi.org>, Robin Berjon <robin@berjon.com>, public-audio@w3.org, Dom Hazael-Massieux <dom@w3.org>, jussi.kalliokoski@gmail.com
Message-Id: <486491328140912@web2.yandex.ru>
Just want to say that, at the very least, the ability to simply _play_ Standard MIDI files inside the browser would be very nice.

Maybe adding support for playing GM MIDI files with the HTML5 <audio> element would be a good beginning (apart from more complex things like exposing MIDI hardware inputs to web applications). It would open genuinely new opportunities for sites publishing MIDI music and would improve usability for users. (Currently, users are forced to download a MIDI file and then play it with a standalone player.)

A MIDI file is hundreds of times smaller than a Wave-like audio file, but while we can play the latter right inside the browser, we somehow can't do the same with MIDI files.
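As a rough sanity check of that size ratio (both figures below are illustrative assumptions, not measurements: CD-quality 16-bit stereo PCM for the uncompressed audio, and a 30 KB Standard MIDI File for a three-minute piece):

```typescript
// Rough size comparison: 3 minutes of CD-quality uncompressed audio
// vs. a typical Standard MIDI File of the same piece.
// The 30 KB MIDI size is an illustrative assumption, not a measurement.
const seconds = 180;
const sampleRate = 44100;   // samples per second
const bytesPerSample = 2;   // 16-bit PCM
const channels = 2;         // stereo

const wavBytes = seconds * sampleRate * bytesPerSample * channels; // ~31.8 MB
const midiBytes = 30 * 1024;                                       // ~30 KB

const ratio = wavBytes / midiBytes;
console.log(Math.round(ratio)); // on the order of a thousand times smaller
```

So "hundreds of times smaller" is, if anything, an understatement for uncompressed audio; against compressed formats like MP3 the gap is narrower but still large.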

By the way, IE at least supports playing MIDI files via the <bgsound> element (not perfect, but at least a _really available_ option), while other browsers do not support MIDI playback _at all_. That's very sad. Hopefully the situation will start changing now.
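No browser is known to support SMF playback through <audio> today, but if one ever did, a page could feature-detect it through the standard canPlayType() method of media elements. A minimal sketch, assuming the commonly used audio/midi and audio/x-midi MIME types (the helper name is invented):

```typescript
// Feature-detect SMF playback support on an <audio>-like object.
// Per the HTML media element rules, canPlayType returns "", "maybe",
// or "probably"; anything non-empty means playback might work.
interface AudioLike {
  canPlayType(mimeType: string): string;
}

function supportsMidiPlayback(audio: AudioLike): boolean {
  return ["audio/midi", "audio/x-midi"]
    .some(type => audio.canPlayType(type) !== "");
}

// In a browser this would be called as:
//   supportsMidiPlayback(document.createElement("audio"))
```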


01.02.2012, 22:58, "Doug Schepers" <schepers@w3.org>:
> Hi, folks-
> After discussion with the MIDI Manufacturers Association, we resolved at
> the Audio WG f2f to propose adding a MIDI API to the Audio WG charter.
> Chris Wilson has been championing this, and has offered to be one of the
> editors of such a spec [1].
> I believe that the MIDI folks would welcome such an API, based on my
> conversations with them. (Please correct me if I'm wrong, Tom.) I also
> think there is interest in allowing for rendering of Standard MIDI Files
> (SMF), but that may be more a matter for an HTML spec (though perhaps
> they could be rendered through an audio API?).
> So, unless anyone in this group objects to this, I plan to amend the
> Audio WG charter (subject to AC approval, of course) to include:
> * connecting music controllers to the Web
> * exposing bidirectional messaging channels to devices
> (... or similar more appropriate wording).
> This would be a separate deliverable from any audio API currently under
> development.
> Thoughts?
> [1]
> http://lists.w3.org/Archives/Public/public-webevents/2011OctDec/0027.html
> Regards-
> -Doug
> On 2/1/12 1:24 PM, Chris Wilson wrote:
>> Forking subject.
>> So I can see the following enumeration/creation scenarios for MIDI:
>> - Enumerating MIDI interfaces present on the device (e.g. I have several
>> multi-port MIDI interfaces present; this lets me enumerate each input or
>> output, and likely get a unique ID for each one, so a developer could
>> maintain a MIDI config profile).
>> - Creating a virtual output port (e.g. a developer wants to create a
>> software synth program that, while running, creates a new device output
>> for other programs to enumerate) with some form of unique ID
>> - Creating a virtual input port (e.g. a developer wants to create a
>> sequencer program that, while running, creates a new device input for
>> other programs to enumerate) with some form of unique ID
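The three scenarios above could be sketched roughly as follows; every type, function, and ID scheme here is invented for illustration and does not describe any shipped API:

```typescript
// Hypothetical shape for the enumeration/creation scenarios above.
// All names are invented for illustration.
type PortDirection = "input" | "output";

interface MidiPort {
  id: string;        // stable unique ID, so apps can keep config profiles
  name: string;
  direction: PortDirection;
  virtual: boolean;  // true for ports created by other programs
}

// Scenario 1: enumerate each input or output present on the device.
function portsByDirection(ports: MidiPort[], dir: PortDirection): MidiPort[] {
  return ports.filter(p => p.direction === dir);
}

// Scenarios 2 and 3: a soft synth or sequencer registers a virtual port
// that other programs can then enumerate alongside hardware ports.
function addVirtualPort(ports: MidiPort[], name: string, dir: PortDirection): MidiPort[] {
  const id = `virtual:${name}:${dir}`; // invented ID scheme
  return [...ports, { id, name, direction: dir, virtual: true }];
}
```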
>> Questions for the MIDI devs here-
>> - Is the "synth" output designation in Windows useful? The #voices,
>> type of synth, etc.? Seems a bit overdone to me. It would seem like
>> being able to tell "this is a software synth" would be useful - although
>> - How about the MIDI device manufacturer/product ID? Driver version #?
>> - Windows MIDI mapper. This always seemed overblown to me; and, of
>> course, you can just use it as a device in Windows. I don't think we
>> need special exposure, as in the Windows APIs. Thoughts?
>> My off-the-cuff feeling, BTW, is that there should be a commonality of
>> pattern between audio port enumeration and MIDI port enumeration, but I
>> think they will end up as separate APIs.
>> -C
>> On Wed, Feb 1, 2012 at 9:29 AM, Joseph Berkovitz <joe@noteflight.com
>> <mailto:joe@noteflight.com>> wrote:
>> I also agree with Tom's suggestion that MIDI devices be considered
>> as part of this sphere.
>> ...joe
>> On Jan 31, 2012, at 7:25 PM, Tom White (MMA) wrote:
>>> Chris Rogers said
>>> it would be good to have an introspection API to enumerate the
>>> available audio devices for both audio input and output. A
>>> built-in microphone would be one such device. Also commonly
>>> available is the line-in analog audio input on a laptop or
>>> desktop computer. And, of course, any externally connected
>>> multi-channel USB or Firewire audio devices. Some of these
>>> can present eight (or more) audio input and
>>> output channels simultaneously.
>>> It's important not to consider audio input in isolation, but
>>> also to consider audio output capabilities when enumerating the devices.
>>> I can't help but point out that MIDI devices (software and
>>> hardware ports) are commonly used for audible* input/output and
>>> thus would also be a candidate for enumeration...
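Tom's point could be sketched as a single device list spanning both audio directions plus MIDI; the DeviceInfo shape and the "midi" kind here are invented for illustration (they loosely mirror the audio input/output distinction discussed above):

```typescript
// Group an enumerated device list by kind, covering both audio
// directions and MIDI ports in one pass. All names are illustrative.
interface DeviceInfo {
  deviceId: string;
  kind: "audioinput" | "audiooutput" | "midi";
  label: string;
}

function groupByKind(devices: DeviceInfo[]): Map<string, DeviceInfo[]> {
  const groups = new Map<string, DeviceInfo[]>();
  for (const d of devices) {
    const bucket = groups.get(d.kind) ?? [];
    bucket.push(d);
    groups.set(d.kind, bucket);
  }
  return groups;
}
```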
>>> Tom White
>>> www.midi.org
>>> *I say "audible" instead of "audio" so there is no confusion
>>> between MIDI and audio <g>
>> ...joe
>> *Joe Berkovitz*
>> President
>> *Noteflight LLC*
>> 84 Hamilton St, Cambridge, MA 02139
>> phone: +1 978 314 6271
>> www.noteflight.com
Received on Thursday, 2 February 2012 00:02:35 UTC
