
RE: ACTION-33: MIDI charter proposal

From: James Ingram <j.ingram@netcologne.de>
Date: Wed, 29 Feb 2012 12:27:18 +0100
Message-ID: <4F4E0B96.5060307@netcologne.de>
To: public-audio@w3.org

> Looks good to me, at first review.
>
> Tom White
> MMA
>
>
>    _____
>
> From: Chris Wilson [mailto:cwilso@google.com]
>
> My goal in proposing a MIDI API was to provide the web platform's analog to
> CoreMIDI on MacOS/iOS, or the Windows MIDI API.  Here's a proposal for
> charter text:
>
>
>
> MIDI Device Communication
>
>
> Some user agents have connected music devices, such as synthesizers,
> keyboard controllers and drum machines.  The widely adopted MIDI protocol
> enables electronic musical instruments, controllers and computers to
> communicate and synchronize with each other. MIDI does not transmit audio
> signals: instead, it sends event messages about musical notes, controller
> signals for parameters such as volume, vibrato and panning, cues and clock
> signals to set the tempo, and system-specific MIDI communications (e.g. to
> remotely store synthesizer-specific patch data).  Additionally, MIDI has
> become a standard for show control, lighting and special effects control.
>
> This deliverable defines an API supporting the MIDI protocol and MIDI device
> input and output selection.  This API should be defined at such a level as
> to enable non-music MIDI applications as well as music ones.
>
>
>
>
> -Chris

I've been thinking about where I'd like to see MIDI going here. This may 
or may not affect the wording of the charter, but perhaps a little 
brainstorming might help.

A MIDI API could have (at least) the following five sub-interfaces:
1. Input and Output Devices
2. JavaScript Synthesizers
3. Standard MIDI files
4. Synthesizer-specific files
5. MIDI Realtime Messages

---------------------------
1. Input and Output Devices
These are essential. Implementing them would require a small amount of 
code, and enable a huge amount of functionality.
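
As a rough illustration of how little code device enumeration might need, 
here is a hypothetical sketch. None of these names come from the charter 
proposal; the registry object is a stand-in for whatever the browser would 
eventually expose.

```javascript
// Hypothetical sketch of enumerating MIDI ports. The midiAccess object
// is a mock standing in for a browser-provided registry; all names and
// device entries here are invented for illustration.
const midiAccess = {
  inputs:  [{ id: "in-0",  name: "Keyboard Controller" }],
  outputs: [{ id: "out-0", name: "Hardware Synth" },
            { id: "out-1", name: "JS Synth (opted in)" }],
};

// Collect the human-readable names of all available ports.
function listPorts(access) {
  return {
    inputs:  access.inputs.map((p) => p.name),
    outputs: access.outputs.map((p) => p.name),
  };
}

const ports = listPorts(midiAccess);
console.log(ports.inputs);   // ["Keyboard Controller"]
console.log(ports.outputs);  // ["Hardware Synth", "JS Synth (opted in)"]
```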

---------------------------
2. JavaScript Synthesizers
Presumably these define their own APIs (which need not be MIDI). If a 
JavaScript synthesizer *is* a MIDI synthesizer, it should probably be 
allowed to say whether it wants to be included in the MIDI Output 
Devices enumeration or not. Some JavaScript synthesizers will be 
controlled directly by other JavaScript in the same file, and send their 
output directly to the sound system. Such synthesizers would not need to 
go via the MIDI API.
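
A minimal sketch of what "a JavaScript synthesizer that *is* a MIDI 
synthesizer" could look like: an object that accepts raw MIDI bytes 
through a send() method, so it could in principle be listed alongside 
hardware output devices. The method name and object shape are my own 
assumptions, not from any spec.

```javascript
// Equal temperament: A4 (MIDI note 69) = 440 Hz.
function midiNoteToHz(note) {
  return 440 * Math.pow(2, (note - 69) / 12);
}

// Illustrative synthesizer object: decodes note-on/note-off messages
// and tracks which pitches are currently sounding.
const jsSynth = {
  activeNotes: new Map(),
  send(bytes) {
    const status = bytes[0] & 0xf0;
    if (status === 0x90 && bytes[2] > 0) {
      // Note-on with nonzero velocity.
      this.activeNotes.set(bytes[1], midiNoteToHz(bytes[1]));
    } else if (status === 0x80 || (status === 0x90 && bytes[2] === 0)) {
      // Note-off (0x80, or note-on with velocity 0).
      this.activeNotes.delete(bytes[1]);
    }
  },
};

jsSynth.send([0x90, 69, 100]);                 // note-on: A4
console.log(jsSynth.activeNotes.get(69));      // 440
jsSynth.send([0x80, 69, 0]);                   // note-off
```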

---------------------------
3. Standard MIDI Files
There should be a standard way to convert an SMF into a queue of MIDI 
messages waiting to be sent. Playing an SMF means sending the 
pre-composed messages at the times they define. (Is there a potential 
memory problem here with large files?)
The "standard way" could be JavaScript -- possibly supplied by the MIDI 
Manufacturers Association -- or it could be code inside the browsers or 
operating systems themselves. That's something that would need discussing.
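
The timing arithmetic behind such a queue is straightforward: SMF events 
carry delta times in ticks, the file header gives ticks per quarter note, 
and tempo meta events give microseconds per quarter note (500000, i.e. 
120 BPM, if none is present). A sketch, with invented function names:

```javascript
// Convert an SMF delta time (in ticks) to milliseconds, given the
// header's division (ticks per quarter note) and the current tempo
// (microseconds per quarter note; SMF default is 500000 = 120 BPM).
function deltaTicksToMs(ticks, division, usPerQuarter = 500000) {
  return (ticks * usPerQuarter) / division / 1000;
}

// Turn [deltaTicks, message] pairs into absolute send times in ms.
function buildQueue(events, division, usPerQuarter = 500000) {
  let t = 0;
  return events.map(([delta, msg]) => {
    t += deltaTicksToMs(delta, division, usPerQuarter);
    return { time: t, msg };
  });
}

// A note-on, then a note-off one quarter note later (division = 96):
const queue = buildQueue(
  [[0, [0x90, 60, 100]], [96, [0x80, 60, 0]]], 96);
console.log(queue[1].time);  // 500 (ms)
```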

---------------------------
4. Synthesizer-specific files
As I understand it, these are simply Standard MIDI Files containing 
"System Exclusive" MIDI commands. So they don't need any special 
treatment in a W3C MIDI API. A "System Exclusive" message would be sent 
in the same way as any other MIDI message.
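
To make that concrete: a System Exclusive message is just a byte sequence 
framed by 0xF0 ... 0xF7, so it could travel through the same send() call 
as any channel message. The payload below is made up; 0x7D is the real 
non-commercial/educational manufacturer ID.

```javascript
// Build a System Exclusive message: 0xF0, manufacturer ID, data, 0xF7.
// Data bytes must have the high bit clear (0x00-0x7F).
function buildSysEx(manufacturerId, data) {
  if (data.some((b) => b > 0x7f)) {
    throw new RangeError("SysEx data bytes must be 7-bit");
  }
  return [0xf0, manufacturerId, ...data, 0xf7];
}

const msg = buildSysEx(0x7d, [0x01, 0x02, 0x03]);
console.log(msg);  // [240, 125, 1, 2, 3, 247]
```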

---------------------------
5. MIDI Realtime Messages
MIDI devices are usually synchronized by allocating channels to them, 
and simply sending synchronous, channel-specific messages.
But MIDI Realtime messages can also be used to synchronize output 
devices. These include the MIDI Clock message, which is usually sent 
repeatedly (if at all) at short time intervals (on the order of 20 
milliseconds). To keep memory requirements down, it would be better if 
MIDI Realtime messages were both generated and sent while the main 
message queue is being sent, possibly in a different thread.
So the API needs a separate interface to these:
1. Should Realtime messages be sent at all? (no, by default)
2. Which Realtime messages should be sent, and when?
3. etc.
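
As a sanity check on the clock rate: MIDI Clock is defined as 24 pulses 
per quarter note, so the interval between clock messages follows directly 
from the tempo.

```javascript
// MIDI Clock runs at 24 pulses per quarter note, so the interval
// between clock messages depends only on the tempo (BPM).
const PULSES_PER_QUARTER = 24;

function clockIntervalMs(bpm) {
  return 60000 / (bpm * PULSES_PER_QUARTER);
}

console.log(clockIntervalMs(120).toFixed(1));  // "20.8"
```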

_______________
Here's a possible revision of Chris's proposed text (quoted above):

MIDI Device Communication

The MIDI protocol enables electronic musical instruments, controllers,
computers and other devices to communicate and synchronize with each other.
MIDI is not just the universally accepted standard for digital musical
instruments; it is also used for show control, lighting and special effects
control.
MIDI does not transmit audio signals: instead, it sends event messages about
musical notes and controllers such as volume, vibrato and panning. It can
also send real-time synchronization messages and system-exclusive messages
for particular brands of output device.

This deliverable defines an API supporting the MIDI protocol and MIDI device
input and output selection.  This API should be defined at such a level as
to enable non-music MIDI applications as well as music ones.



Hope that helps,
best wishes,
James Ingram

-- 
www.james-ingram-act-two.de
Received on Wednesday, 29 February 2012 11:28:00 GMT
