Re: Draft Updated Charter adding Mouse Lock and Gamepad

From: Chris Wilson <cwilso@google.com>
Date: Wed, 5 Oct 2011 11:30:04 -0700
Message-ID: <CAJK2wqW_GaQ_=KhHOWFZ1gMgR2erfErUiAwNkVo47_piSbqqeg@mail.gmail.com>
To: Olli@pettay.fi
Cc: public-webevents@w3.org, "Robert O'Callahan" <robert@ocallahan.org>
On Wed, Oct 5, 2011 at 1:19 AM, Olli Pettay <Olli.Pettay@helsinki.fi> wrote:

> On 10/05/2011 03:11 AM, Chris Wilson wrote:
>> I think you're thinking of an API that would say "play this .MID file".
> I'm not, at least not exactly.
> I'm thinking about various DAW use cases, where one can record and play
> simultaneously several audio and midi tracks.
> One would somehow be able to convert the midi input to a stream which
> could be synchronized and processed like simultaneously playing audio
> track.

In practice, those DAW cases are going to have significant custom code
anyway (e.g. for configuring multiple MIDI I/Os), and they're going to
want to define their own formats (for multi-interface I/O - .MID files are
single-interface).  I could see wanting to tie the MIDI interface
selection (if you have multiple MIDI interfaces) into the Stream case (which
is just a devolved <audio src=file.mid> imo, as I said previously), but
frankly, no one has thought this interesting enough to do in a desktop OS
yet either.  Windows has a "default MIDI interface"; that's it.  MIDI has
sixteen channels, which is generally more than enough to capture the bulk of
songs (note that's channels, not polyphony).  If all you're doing is
streaming a straight MIDI file, I'd frankly expect it to just work through
the streaming system like any other audio file.

>> synchronization.  That's not as hard as it sounds, since MIDI is really
>> just a relatively quite sparse string of events (messages) to fire
>> anyway.
> Well, comparing to common event handling there can be a lot
> more midi messages than say mousemove events.
> Just use some pitch bend controller or other similar to create
> controller messages (and this all for several [up to 16 per midi device]
> channels).

Technically, that could be true.  In practice, 1) gaming mice (e.g.
http://www.logitech.com/en-us/mice-pointers/mice/devices/5750) can send
upwards of 1000 updates a second, and 2) MIDI was designed in 1982 as a
format that frequently still goes over a physical cable running a serial
protocol at 31.25 kbaud.  With a start and a stop bit framing each byte,
that works out to a bit over 3,000 bytes per second.  Most common MIDI
messages (key on/off and pressure/pitchbend) are 3 bytes long.  That puts
you just a little over 1,000 messages a second - and that's the
theoretical maximum.  In practice, a pretty complete orchestration is just
not that big.  A complete orchestration of a 4-minute pop song is in the
range of 100k as a MIDI file - that's less than 1/4 of that density.
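For concreteness, here's the back-of-the-envelope arithmetic, assuming the
MIDI 1.0 serial framing of one start and one stop bit per data byte (the
figures below are my ballpark, not measurements):

```python
# MIDI 1.0 serial transport: 31,250 baud, 10 bits on the wire per byte
# (1 start + 8 data + 1 stop).
BAUD = 31_250
WIRE_BITS_PER_BYTE = 10
MSG_BYTES = 3  # typical channel voice message: status byte + 2 data bytes

bytes_per_sec = BAUD // WIRE_BITS_PER_BYTE   # 3125 bytes/s max on the cable
msgs_per_sec = bytes_per_sec // MSG_BYTES    # ~1041 3-byte messages/s max

# A ~100 KB MIDI file spread over a 4-minute song:
file_bytes_per_sec = 100_000 // 240          # ~416 bytes/s average

print(bytes_per_sec, msgs_per_sec, file_bytes_per_sec)
```

So even that "complete orchestration" averages well under the wire's
theoretical maximum.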

Yes, it's possible that you would have a dozen devices (I do at home), all
on separate MIDI interfaces, all 16-channel polytonic, all running full
density; but that's like saying I could have 16 gamepads plugged in, all
wiggling the joystick at the same time.  In practice, that's just not the
kind of density you'd get.  Remember, the controller messages are all 7-bit;
they just don't update as much as you'd think, and controllers like
pitchbend aren't continuously streaming at high density throughout an
entire song.

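To make the message structure concrete, here's a minimal sketch of decoding
a 3-byte MIDI channel message - the 4-bit channel nibble (hence 16 channels
per interface) and the 7-bit data values.  The `decode` helper is mine, just
for illustration:

```python
def decode(msg: bytes):
    """Split a 3-byte MIDI channel voice message into its fields."""
    status, d1, d2 = msg
    kind = status >> 4        # high nibble: 0x9 = note on, 0xB = control change
    channel = status & 0x0F   # low nibble: channel 0-15 (16 per interface)
    return kind, channel, d1 & 0x7F, d2 & 0x7F  # data bytes are 7-bit

# Note On, channel 0, middle C (60), velocity 100:
print(decode(bytes([0x90, 60, 100])))   # (9, 0, 60, 100)
# Control Change, channel 3, controller 1 (mod wheel), value 127:
print(decode(bytes([0xB3, 1, 127])))    # (11, 3, 1, 127)
```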
> But I'm certainly not an expert in Midi. Just someone who has used
> some DAW softwares.
> Would be good to ask some DAW vendor how they handle synchronization of
> audio/video and midi.

Yup, I'd like to do that - and I think Doug has a contact there.  But I
don't think we need to start with that as the design of a low-level MIDI
I/O API.  Windows and Mac don't.
Received on Wednesday, 5 October 2011 18:30:36 UTC

This archive was generated by hypermail 2.4.0 : Friday, 17 January 2020 19:03:54 UTC