Re: Adding MIDI APIs to Audio WG Charter (was: MIDI enumeration (was: Re: getUserMedia use cases))

Just want to say that at least the ability to simply _play_ Standard MIDI Files inside the browser would be very nice.

Maybe adding support for playing General MIDI (GM) files with the HTML5 <audio> element would be a good starting point (apart from more complex things like exposing MIDI hardware inputs to web applications). That would open genuinely new opportunities for sites publishing MIDI music and would improve usability for users. (Currently, users are forced to download a MIDI file and then play it with a standalone player.)
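For what it's worth, pages could feature-detect such support with the existing HTMLMediaElement.canPlayType() method. A minimal sketch, where the "audio/midi" MIME type check is the hypothetical part (no current browser reports support for it):

```javascript
// Sketch: feature-detect hypothetical MIDI playback support in <audio>.
// canPlayType() is a real HTMLMediaElement method; support for the
// "audio/midi" type is the hypothetical part of this example.
function midiAudioSupport() {
  if (typeof Audio === 'undefined') return 'no-audio-element'; // non-browser environment
  var probe = new Audio();
  // canPlayType() returns "", "maybe", or "probably"
  return probe.canPlayType('audio/midi') || 'unsupported';
}

console.log(midiAudioSupport());
```

Since canPlayType() returns the empty string for unrecognized types, the empty result falls through to the 'unsupported' label.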

A MIDI file is hundreds of times smaller than a Wave-like audio file, yet while we can play the latter right inside the browser, we somehow can't do the same with MIDI files.

By the way, IE at least supports playing MIDI files via the <bgsound> element (not perfect, but at least a _really available_ option), while other browsers do not support MIDI playback _at all_. That's very sad. Hopefully the situation will start changing now.
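For reference, a minimal sketch of that IE-only path (the <bgsound> element and its src/loop attributes are real but were never standardized, and "song.mid" is a placeholder file name):

```javascript
// Sketch of the legacy, IE-only <bgsound> approach. In IE, appending the
// element would start playback; other browsers ignore <bgsound> entirely.
function playMidiLegacy(url) {
  if (typeof document === 'undefined') return null; // non-browser environment
  var el = document.createElement('bgsound');
  el.setAttribute('src', url);   // e.g. "song.mid" (placeholder)
  el.setAttribute('loop', '1');  // play once; "-1" meant loop forever in IE
  document.body.appendChild(el);
  return el;
}

console.log(playMidiLegacy('song.mid'));
```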

Thanks.


01.02.2012, 22:58, "Doug Schepers" <schepers@w3.org>:
> Hi, folks-
>
> After discussion with the MIDI Manufacturers Association, we resolved at
> the Audio WG f2f to propose adding a MIDI API to the Audio WG charter.
>
> Chris Wilson has been championing this, and has offered to be one of the
> editors of such a spec [1].
>
> I believe that the MIDI folks would welcome such an API, based on my
> conversations with them.  (Please correct me if I'm wrong, Tom.)  I also
> think there is interest in allowing for rendering of Standard MIDI Files
> (SMF), but that may be more a matter for an HTML spec (though perhaps
> they could be rendered through an audio API?).
>
> So, unless anyone in this group objects to this, I plan to amend the
> Audio WG charter (subject to AC approval, of course) to include:
> * connecting music controllers to the Web
> * exposing bidirectional messaging channels to devices
>
> (... or similar more appropriate wording).
>
> This would be a separate deliverable from any audio API currently under
> development.
>
> Thoughts?
>
> [1]
> http://lists.w3.org/Archives/Public/public-webevents/2011OctDec/0027.html
>
> Regards-
> -Doug
>
> On 2/1/12 1:24 PM, Chris Wilson wrote:
>
>> Forking subject.
>>
>> So I can see the following enumeration/creation scenarios for MIDI:
>>
>> - Enumerating MIDI interfaces present on the device (e.g. I have several
>> multi-port MIDI interfaces present; this lets me enumerate each input or
>> output and likely get unique IDs for each one, so a developer could
>> maintain a MIDI config profile).
>> - Creating a virtual output port (e.g. a developer wants to create a
>> software synth program that, while running, creates a new device output
>> for other programs to enumerate) with some form of unique ID
>> - Creating a virtual input port (e.g. a developer wants to create a
>> sequencer program that, while running, creates a new device input for
>> other programs to enumerate) with some form of unique ID
>>
>> Questions for the MIDI devs here-
>> - Is the "synth" output designation in Windows useful?  The #voices,
>> type of synth, etc?  Seems a bit overdone, to me.  It would seem like
>> being able to tell "this is a software synth" would be useful - although
>> - How about the MIDI device manufacturer/product ID?  Driver version #?
>> - Windows MIDI mapper.  This always seemed overblown to me; and, of
>> course, you can just use it as a device in Windows.  I don't think we
>> need special exposure, as in the Windows APIs.  Thoughts?
>>
>> My off-the-cuff feeling, BTW, is that there should be a commonality of
>> pattern between audio port enumeration and MIDI port enumeration, but I
>> think they will end up as separate APIs.
>>
>> -C
>>
>> On Wed, Feb 1, 2012 at 9:29 AM, Joseph Berkovitz <joe@noteflight.com
>> <mailto:joe@noteflight.com>> wrote:
>>
>>     I also agree with Tom's suggestion that MIDI devices be considered
>>     as part of this sphere.
>>
>>     ...joe
>>
>>     On Jan 31, 2012, at 7:25 PM, Tom White (MMA) wrote:
>>> Chris Rogers said
>>>
>>>         it would be good to have an introspection API to enumerate the
>>>         available audio devices for both audio input and output.  A
>>>         built-in microphone would be one such device.  Also commonly
>>>         available is the line-in analog audio input on a laptop or
>>>         desktop computer.  And, of course, any externally connected
>>>         multi-channel USB or Firewire audio devices.  Some of these
>>>         can present eight (or more) audio input and output channels
>>>         simultaneously.
>>>         It's important to not consider audio input in isolation, but
>>>         also audio output capabilities when enumerating the devices.
>>>
>>>     I can't help but point out that MIDI devices (software and
>>>     hardware ports) are commonly used for audible* input/output and
>>>     thus would also be a candidate for enumeration...
>>>     Tom White
>>>     www.midi.org <http://www.midi.org/>
>>>     *I say "audible" instead of "audio" so there is no confusion
>>>     between MIDI and audio <g>
>>     ...Joe
>>
>>     *Joe Berkovitz*
>>     President
>>
>>     *Noteflight LLC*
>>     84 Hamilton St, Cambridge, MA 02139
>>     phone: +1 978 314 6271 <tel:%2B1%20978%20314%206271>
>>     www.noteflight.com <http://www.noteflight.com>
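The three enumeration/creation scenarios Chris lists above might be sketched, purely hypothetically, as follows. Every name here (MidiRegistry, listPorts, createVirtualPort) is invented for illustration and does not correspond to any shipped API:

```javascript
// Hypothetical sketch only: no such API existed when this thread was written.
// Models the three scenarios: enumerating ports with stable unique IDs, and
// registering virtual output/input ports that other programs could enumerate.
class MidiRegistry {
  constructor() { this.ports = new Map(); this.nextId = 0; }

  // Scenario 1: enumerate every input or output, each with a unique ID
  listPorts(direction) {
    return [...this.ports.values()].filter(p => p.direction === direction);
  }

  // Scenarios 2 & 3: a program registers a virtual port that persists
  // while it runs, so other programs can see it in their enumeration
  createVirtualPort(name, direction) {
    const port = { id: 'virt-' + this.nextId++, name, direction };
    this.ports.set(port.id, port);
    return port;
  }
}

const registry = new MidiRegistry();
registry.createVirtualPort('Soft Synth', 'output');   // virtual synth output
registry.createVirtualPort('Sequencer In', 'input');  // virtual sequencer input
console.log(registry.listPorts('output').map(p => p.id));
```

The key design point is the stable unique ID per port, which is what would let a developer persist a MIDI configuration profile across sessions, as the first scenario suggests.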

Received on Thursday, 2 February 2012 00:02:35 UTC