Google's proposal to add MIDI API to Web Events WG [Was: Fwd: Re: Draft Updated Charter adding Mouse Lock and Gamepad]

Hi Chris, Raman,

I realize Chris' original thread is ongoing but I wanted to step back a 
bit with a few process-related questions ...

What are your thoughts about starting a Community Group first to build a 
community and a spec?

I understand Chris has some rationale against taking this API to the 
Audio WG (where Google is a member) but have you actually discussed this 
API with them? I'm wondering if their current charter "as is" could 
already cover this API, or if they would be interested in expanding 
their scope to accommodate this functionality.

Lastly, I would like to get a clearer sense of which other Members and 
non-Members support this proposal. So, folks - please speak up.

-Thanks, ArtB

-------- Original Message --------
Subject: 	Re: Draft Updated Charter adding Mouse Lock and Gamepad
Resent-Date: 	Tue, 4 Oct 2011 21:07:03 +0000
Resent-From: 	<public-webevents@w3.org>
Date: 	Tue, 4 Oct 2011 14:06:21 -0700
From: 	ext Chris Wilson <cwilso@google.com>
To: 	<public-webevents@w3.org>



I'd been talking with a variety of people about the need for a Music 
Controller API - i.e. MIDI input/output, so I can synchronize music 
apps, as well as interface my physical keyboard controllers, 
synthesizers and drum machines with the web platform.  After some 
thought, I'd like to propose that Music Device Communication be added to 
the Web Events charter - I believe the challenges of this API are quite 
similar to those of the Gamepad API (a different API, but the same 
general kinds of patterns, and heavily event-based). This would be the web platform's 
analog to CoreMIDI on MacOS/iOS, or the Windows MIDI API. Proposed 
charter text would read something like this:


Music Device Communication

Some user agents have connected music devices, such as synthesizers, 
keyboard controllers and drum machines.  The widely adopted MIDI 
protocol enables electronic musical instruments, controllers and 
computers to communicate and synchronize with each other. MIDI does not 
transmit audio signals: instead, it sends event messages about musical 
notes, controller signals for parameters such as volume, vibrato and 
panning, cues and clock signals to set the tempo, and system-specific 
MIDI communications (e.g. to remotely store synthesizer-specific patch 
data).

This deliverable defines API support for the MIDI protocol and common 
music device scenarios in the web platform.

--------------

Some background on why I think it belongs in the Events WG rather than the 
Audio WG:
MIDI is actually very much like game controllers in terms of being 
entirely event-based, and fundamentally being a model of sending 
uni-directional streams of controller messages between devices (a 
bidirectional setup is just a pair of one-way connections).  In fact, 
Microsoft's SideWinder Force Feedback Pro joystick (I still have one 
somewhere in my basement, I think - 
http://en.wikipedia.org/wiki/Microsoft_SideWinder#Joystick) actually 
utilized the MIDI break-out pins of the typical analog game port 
(http://en.wikipedia.org/wiki/Game_port#MIDI_connectors) on the sound 
card as a channel for digital data.
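
To make "controller messages" concrete: on the wire, a MIDI 1.0 message 
is just a status byte plus up to two data bytes. The byte values below 
are straight from the MIDI 1.0 spec; the array literals are only 
illustration, not proposed API surface.

    // Note On, channel 1 (status 0x90), middle C (note 60), full velocity:
    const noteOn   = [0x90, 60, 127];
    // Matching Note Off (status 0x80); many devices instead send a
    // Note On with velocity 0:
    const noteOff  = [0x80, 60, 0];
    // Continuous controller: CC 7 (channel volume) set to 100 on channel 1:
    const ccVolume = [0xB0, 7, 100];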

The significant challenges inherent in the Audio API - the 
sample-accurate scheduled playback model, the convolution engine and 
other "inline effects" - don't apply at all to a MIDI API, and there's 
not much in the way of shared concepts between audio and MIDI devices 
themselves (many audio adapters implement a MIDI port too, but it shows 
up as a completely separate device in MacOS/Windows).  MIDI is a very 
event-based protocol (I've written a bunch of MIDI software in the 
distant past, down to implementing a MIDI driver) - there's no 
scheduling, so you need to deliver events as they arrive.  (A 
millisecond of jitter here or there in MIDI isn't a big deal, whereas in 
audio a gap like that would be unacceptable.)
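
The difference in a few lines (the audio half assumes an 
OscillatorNode-style start(when) call, exact names modulo whatever the 
Audio WG ships; the MIDI half - midiInput, onmessage - is purely 
hypothetical):

    // Audio: playback is scheduled ahead of time, sample-accurately.
    const ctx = new AudioContext();
    const osc = ctx.createOscillator();
    osc.connect(ctx.destination);
    osc.start(ctx.currentTime + 0.5);  // begin exactly half a second from now

    // MIDI: nothing is scheduled; a handler fires whenever a message arrives.
    declare const midiInput: { onmessage: ((bytes: number[]) => void) | null };  // hypothetical
    midiInput.onmessage = (bytes) => console.log("arrived:", bytes);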

A MIDI API, on the other hand, would (I expect) have some music-specific 
APIs (e.g. NoteOn()/NoteOff() kinda stuff), but mostly it's about 
plugging in event handlers for note on/off and continuous controller 
messages (and sending the same kinds of messages out, of course), as 
well as the configuration system for "which MIDI port should I use". 
Those all seem like much the same problems the other Events APIs face, 
particularly the Game Controller API.  If I didn't feel like the music 
controller API should have some music-specific APIs (e.g. the 
aforementioned noteon/off), I'd actually say it's just a slightly 
different game controller.  MIDI may be frequently used in conjunction 
with the Web Audio API in an actual app, but the use cases, scenarios 
and requirements are pretty different.
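
As a rough sketch of that shape - every name here (requestMusicPort, 
MusicPort, onnoteon, noteOn) is made up purely for illustration, not a 
proposal of actual surface:

    // Hypothetical sketch only: none of these names exist in any spec.
    interface NoteEvent { channel: number; note: number; velocity: number; }
    interface ControlEvent { channel: number; controller: number; value: number; }

    interface MusicPort {
      onnoteon: ((e: NoteEvent) => void) | null;
      onnoteoff: ((e: NoteEvent) => void) | null;
      oncontrolchange: ((e: ControlEvent) => void) | null;
      noteOn(channel: number, note: number, velocity: number): void;
      noteOff(channel: number, note: number): void;
    }

    // The "which MIDI port should I use" configuration question is assumed
    // to be answered by the UA behind this equally hypothetical entry point.
    declare function requestMusicPort(opts: { input: boolean; output: boolean }): MusicPort;

    const port = requestMusicPort({ input: true, output: true });
    // Echo incoming notes back out, transposed up an octave:
    port.onnoteon  = (e) => port.noteOn(e.channel, e.note + 12, e.velocity);
    port.onnoteoff = (e) => port.noteOff(e.channel, e.note + 12);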

Obviously, I'd volunteer to edit (though I'm not tied to doing so 
either).  If others feel this fits better elsewhere, please give me an 
idea where.

Thanks!
-Chris

From: Arthur Barstow <art.barstow@nokia.com>
Date: Tue, 27 Sep 2011 12:24:31 -0400
Message-ID: <4E81F8BF.1040507@nokia.com>
To: ext Doug Schepers <schepers@w3.org>, public-webevents@w3.org

Doug - thanks; this looks good to me.

All - if you have any comments, please send them to public-webevents as
soon as possible.

On 9/27/11 10:48 AM, ext Doug Schepers wrote:
 > Hi, Folks-
 >
 > I have made a first draft of a proposed WebEvents charter revision to
 > add the Mouse Lock API and Gamepad API specs.
 >
 > http://www.w3.org/2010/webevents/charter/2011/Overview.html
 >
 > I made minimal changes to bring these specs into scope, which should
 > make an easy AC review.
 >
 > Please review the draft charter and let me know what you think.
 >
 > Regards-
 > -Doug Schepers
 > W3C Developer Outreach
 > Project Coordinator, SVG, WebApps, Touch Events, and Audio WGs
 >

Received on Thursday, 6 October 2011 17:40:18 UTC