
Re: Draft Updated Charter adding Mouse Lock and Gamepad

From: Olli Pettay <Olli.Pettay@helsinki.fi>
Date: Wed, 05 Oct 2011 11:19:29 +0300
Message-ID: <4E8C1311.6030801@helsinki.fi>
To: Chris Wilson <cwilso@google.com>
CC: public-webevents@w3.org, "Robert O'Callahan" <robert@ocallahan.org>
On 10/05/2011 03:11 AM, Chris Wilson wrote:
> I think you're thinking of an API that would say "play this .MID file".
I'm not, at least not exactly.
I'm thinking about various DAW use cases, where one can record and play
several audio and MIDI tracks simultaneously.

One should somehow be able to convert the MIDI input to a stream which
could be synchronized and processed like simultaneously playing audio.
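
Something like this, perhaps (a rough sketch only; getMIDIInput,
StreamRecorder and audioTrack are all invented names, nothing like
this exists today):

    // Hypothetical: ask the UA for MIDI input as a media-stream-like
    // object, then record it on the same timeline as an audio track.
    navigator.getMIDIInput(function (midiStream) {
      var recorder = new StreamRecorder([audioTrack, midiStream]);
      recorder.start(); // both tracks share one clock, so they stay in sync
    });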

>   That is one possible function - but typically, that's handled at a
> media player level.  (I.e., I'd expect that this is the same problem as
> synchronizing a separate audio track to a video track - in fact, I would
> expect this to be accomplished with "<audio src='playme.mid'>", or
> accomplished via the normal media streams synchronization as just
> another type of audio track.)  That's not the same scenario and set of
> use cases as the more common "I want to implement a
> synthesizer/controller/sequencer and communicate with the real world."
> Both Windows and Mac (as I understand it) allow you to buffer-send MIDI
> packets with timestamped "streams".  Neither platform, to my knowledge,
> enables synchronization with simultaneous audio/video through this
> mechanism; if you had a real need for this kind of synchronization
> (e.g., you're building a video production app and want to synchronize
> outbound MIDI), you would ignore the buffer send and send the messages
> at the appropriate time yourself - i.e. roll your own cross-media
> synchronization.  That's not as hard as it sounds, since MIDI is really
> just a relatively sparse string of events (messages) to fire
> anyway.
Well, compared to common event handling there can be a lot
more MIDI messages than, say, mousemove events.
Just use a pitch bend wheel or some similar controller and it creates
a flood of controller messages (and all that across several channels,
up to 16 per MIDI device).
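(For scale: classic MIDI's 31250-baud wire rate works out to roughly
a thousand three-byte messages per second per cable, so one busy
controller can already generate far more events than a mouse does.)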

But I'm certainly not an expert in MIDI, just someone who has used
some DAW software.
It would be good to ask a DAW vendor how they handle synchronization of
audio/video and MIDI.
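
If I understand the "roll your own" approach Chris describes above, it
might look roughly like this in JS (a sketch only; sendMIDI() is an
invented placeholder, and I'm assuming Web Audio's context.currentTime
as the master clock):

    // Fire each pre-recorded MIDI event at its target time, measured
    // against the audio clock rather than wall-clock time.
    function scheduleMIDI(events, audioContext) {
      events.forEach(function (ev) {
        var delayMs = (ev.time - audioContext.currentTime) * 1000;
        setTimeout(function () {
          sendMIDI(ev.message); // e.g. [0x90, 60, 100] = note on, middle C
        }, Math.max(0, delayMs));
      });
    }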


> Both platforms expect you to use different services to play MIDI
> files out ("You should use the MCI MIDI sequencer to play MIDI files
> whenever you can" -
> http://msdn.microsoft.com/en-us/library/dd743676(v=VS.85).aspx).
> The far more interesting space to me is the interactive processing and
> dispatching of MIDI messages, that would let me build synthesizers in
> HTML/CSS/JS (like http://www.audiotool.com/app/, but not with Flash) and
> use physical MIDI controllers to play them.  The most interesting use
> cases here to me - a la the couple hundred bucks of iPad music apps I've
> bought in the last year - are in the realtime production and consumption
> of MIDI to use real controllers with the music apps.  Now that CoreMIDI
> is not only supported on iOS, but is starting to be supported in apps, I
> can plug a USB keyboard directly into my iPad (via the camera connection
> kit) and play a world-class organ simulation, play an emulation of a
> Korg analog synth, have a sequencer on one device play instruments on
> another, or simply record my keyboard playing into the GarageBand
> sequencer.  This is all done through fairly low-level controller event
> handling and message sending, which is the part that I think is most
> important to fit together with the rest of how events and messages are
> used in the web.
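
That low-level event handling is the part I can picture most easily.
Something like this, maybe (a sketch only; onmidimessage and midiInput
are invented names, and the Web Audio calls are approximate):

    // Turn incoming note-on messages into Web Audio oscillator notes.
    var ctx = new webkitAudioContext(); // Web Audio, WebKit-prefixed
    midiInput.onmidimessage = function (msg) {
      var status = msg.data[0] & 0xf0;
      if (status === 0x90 && msg.data[2] > 0) { // note on, velocity > 0
        var osc = ctx.createOscillator();
        // MIDI note number to frequency in Hz (A4 = note 69 = 440 Hz)
        osc.frequency.value = 440 * Math.pow(2, (msg.data[1] - 69) / 12);
        osc.connect(ctx.destination);
        osc.noteOn(0); // begin playback immediately
      }
    };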
> On Tue, Oct 4, 2011 at 2:27 PM, Olli Pettay <Olli.Pettay@helsinki.fi
> <mailto:Olli.Pettay@helsinki.fi>> wrote:
>     Since MIDI must be synchronized with audio and video, I'd assume
>     the API would need to be somewhere close to Media Stream APIs.
>     So, I believe MIDI has in fact quite different requirements than
>     Gamepad API.
>     -Olli
>     On 10/05/2011 12:06 AM, Chris Wilson wrote:
>         I'd been talking with a variety of people about the need for a Music
>         Controller API - i.e. MIDI input/output, so I can synchronize music
>         apps, as well as interface my physical keyboard controllers,
>         synthesizers and drum machines with the web platform.  After some
>         thought, I'd like to propose that Music Device Communication be
>         added to
>         the Web Events charter - I believe the challenges of this API
>         are quite
>         similar to the Gamepad API (different API, but the same general
>         kind of
>         patterns, and heavily event-based). This would be the web platform's
>         analog to CoreMIDI on MacOS/iOS, or the Windows MIDI API. Proposed
>         charter text would read something like this:
>               Music Device Communication
>         Some user agents have connected music devices, such as synthesizers