
Re: ACTION-33: MIDI charter proposal

From: Chris Wilson <cwilso@google.com>
Date: Wed, 29 Feb 2012 10:10:22 -0800
Message-ID: <CAJK2wqUT-BsWCyO3cSzrQnbbpE1b+3OCvbpJNFWAL8j7xY8WWQ@mail.gmail.com>
To: James Ingram <j.ingram@netcologne.de>
Cc: public-audio@w3.org
James, I'm going to flip the order of your mail to talk about the charter
proposal first, hope you don't mind.

On Wed, Feb 29, 2012 at 3:27 AM, James Ingram <j.ingram@netcologne.de> wrote:

> Here's a possible revision of Chris' proposed text (quoted above)
>
> MIDI Device Communication
>
> The MIDI protocol enables electronic musical instruments, controllers,
> computers and other devices to communicate and synchronize with each other.
> MIDI is not just the universally accepted standard for digital musical
> instruments, it is also used for show control, lighting and special effects
> control.
>
> MIDI does not transmit audio signals: instead, it sends event messages
> about
> musical notes and controllers such as volume, vibrato and panning. It can
> also send real-time synchronization messages and system-exclusive messages
> for particular brands of output device.
>
> This deliverable defines an API supporting the MIDI protocol and MIDI
> device
> input and output selection.  This API should be defined at such a level as
> to enable non-music MIDI applications as well as music ones.
>

I think this wording is totally fine, except I would avoid saying
"real-time synchronization" - only because real-time has specific meaning
in some contexts that may not apply in MIDI (given the physical transport
for MIDI).  Also, the way that sentence is structured might be construed to
imply that the synchronization is manufacturer-specific, which is not true.
 I would change that sentence to read: "It can also send cues, tempo and
clock signals to synchronize sequencing tasks across devices, as well as
encapsulate system-exclusive messages that only apply to particular brands
of devices (e.g. transfer of synthesizer-specific patch or setup
information)."

Complete text:

MIDI Device Communication

The MIDI protocol enables electronic musical instruments,
controllers, computers and other devices to communicate and synchronize
with each other.  MIDI is not just the universally accepted standard for
digital musical instruments; it is also used for show control, lighting and
special effects control.

MIDI does not transmit audio signals: instead, it sends event messages
about musical notes and controllers such as volume, vibrato and panning. It
can also send cues, tempo and clock signals to synchronize sequencing tasks
across devices, as well as encapsulate system-exclusive messages that only
apply to particular brands of devices (e.g. transfer of
synthesizer-specific patch or setup information).

This deliverable defines an API supporting the MIDI protocol and MIDI
device input and output selection.  This API should be defined at such a
level as to enable non-music MIDI applications as well as music ones.
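To make the "event messages" in the text above concrete, here is a small illustrative sketch (not part of the proposal, and not any real API — just the byte layout defined by the MIDI 1.0 specification) of how a note-on and a controller message are constructed:

```javascript
// Build a MIDI note-on message: status byte 0x90 | channel,
// followed by note number and velocity (all data bytes are 7-bit).
function noteOn(channel, note, velocity) {
  return [0x90 | (channel & 0x0F), note & 0x7F, velocity & 0x7F];
}

// Controller change (e.g. controller 7 = channel volume):
// status byte 0xB0 | channel, then controller number and value.
function controlChange(channel, controller, value) {
  return [0xB0 | (channel & 0x0F), controller & 0x7F, value & 0x7F];
}

// Middle C (note 60) at velocity 100 on channel 0:
console.log(noteOn(0, 60, 100));        // [144, 60, 100]
// Set channel 1's volume (CC 7) to maximum:
console.log(controlChange(1, 7, 127));  // [177, 7, 127]
```

An API at the level the charter text describes would essentially be moving short byte sequences like these to and from selected input and output devices.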


James also wrote:

> 1. Input and Output Devices
> These are essential. Implementing them would require a small amount of
> code, and enable a huge amount of functionality.
>

+1.


> 2. JavaScript Synthesizers
> Presumably these define their own APIs (which need not be MIDI). If a
> JavaScript synthesizer *is* a MIDI synthesizer, it should probably be
> allowed to say whether it wants to be included in the MIDI Output Devices
> enumeration or not. Some Javascript synthesizers will be controlled
> directly by other Javascript in the same file, and send their output
> directly to the sound system. Such synthesizers would not need to go via
> the MIDI API.
>

The more I've thought about this, the more I'm inclined to say this is a v2
extension of the API - the implications of communicating via virtual MIDI
ports to other running web applications are a little concerning.  Not that
I think it's impossible, or unattractive; just that I think it's a more
advanced use case, and I'd like to walk before we try to fly.


> 3. Standard MIDI Files
> There should be a standard way to convert an SMF into a queue of MIDI
> messages waiting to be sent. Playing an SMF means sending the pre-composed
> messages at the times they define. (Is there a potential memory problem
> here with large files?)
> The "standard way" could be JavaScript -- possibly supplied by the MIDI
> Manufacturer's Association -- or it could be code inside the browsers or
> operating systems themselves. That's something which would need discussing.
>

Exploring this need was why I whipped up a quick JavaScript SMF reader a
couple of weeks ago.  Once we have an API, I'm happy to write the rest of
the code needed to prep the file as timestamped MIDI events, and release it
as a library; as I've grown convinced most use cases will want direct
control over SOMETHING during playback, I think releasing this as an open
source library rather than encoding it as a black-box API is probably best.
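For anyone following along: the core of prepping SMF events as timestamped messages is converting delta-time ticks into wall-clock time. A sketch of the standard conversion for PPQN timing (tempo in microseconds per quarter note from the Set Tempo meta event, division in ticks per quarter note from the SMF header — function name mine, not from any existing library):

```javascript
// Convert an SMF delta time (in ticks) to milliseconds.
// division:     ticks per quarter note (from the SMF header, PPQN timing)
// tempoUsPerQn: current tempo in microseconds per quarter note
//               (SMF default is 500000, i.e. 120 bpm)
function ticksToMs(deltaTicks, division, tempoUsPerQn) {
  return (deltaTicks * tempoUsPerQn) / division / 1000;
}

// At the default tempo with 480 ticks per quarter note,
// one quarter note (480 ticks) lasts 500 ms:
console.log(ticksToMs(480, 480, 500000)); // 500
```

Since the tempo can change mid-file, a player has to re-read `tempoUsPerQn` at each Set Tempo event while accumulating elapsed time — one reason direct control over playback matters.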


> 4. Synthesizer-specific files
> As I understand it, these are simply Standard MIDI Files containing
> "System Exclusive" MIDI commands. So they don't need any special treatment
> in a W3C MIDI API. A "System Exclusive" message would be sent in the same
> way as any other MIDI message.
>

Yes, that's correct.  (Well, actually, SYSEX messages are sent slightly
differently, but that's all part of the MIDI spec - and there's no
"system-exclusive FILE format", it's just a standard MIDI file containing
sysex messages.  IIUC, anyway.)
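For what it's worth, the "slightly different" part is just the framing: a sysex message is a variable-length byte stream delimited by 0xF0 and 0xF7, with all bytes in between having the high bit clear (per the MIDI 1.0 spec). A sketch of a validity check (helper name and the example manufacturer bytes are illustrative):

```javascript
// A system-exclusive message is framed as:
//   0xF0, <manufacturer ID and data, all bytes < 0x80>, 0xF7
function isValidSysex(bytes) {
  if (bytes.length < 2) return false;
  if (bytes[0] !== 0xF0 || bytes[bytes.length - 1] !== 0xF7) return false;
  // Every byte between the delimiters must have the high bit clear.
  return bytes.slice(1, -1).every(b => b >= 0 && b < 0x80);
}

// A hypothetical short sysex message:
console.log(isValidSysex([0xF0, 0x43, 0x10, 0x4C, 0xF7])); // true
console.log(isValidSysex([0x90, 60, 100]));                // false (a note-on)
```

So an API only needs to accept the whole delimited byte sequence; no separate file format or message type beyond what MIDI already defines.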


> 5. MIDI Realtime messages
> MIDI devices are usually synchronized by allocating channels to them, and
> simply sending synchronous, channel-specific messages.
> But MIDI Realtime messages can also be used to synchronize output devices.
> These include the MIDI Clock message, which is usually sent repeatedly (if
> at all) at short time intervals (of the order of .02 milliseconds) . To
> keep memory requirements down, it would be better if MIDI Realtime messages
> were both generated and sent while the main message queue is being sent.
> Possibly in a different thread.
> So the API needs a separate interface to these:
> 1. Should Realtime messages be sent at all? (no, by default)
> 2. Which Realtime messages should be sent, and when?
> 3. etc.
>

There are 24 MIDI Clock messages in every quarter note.  So at 120bpm, a
Clock message is sent about every 20.8ms (.02 SECONDS). I would leave clock
timing to the standard API as just another MIDI message - which is what I
believe the Windows and Mac APIs do.
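The arithmetic, as a one-liner (function name mine):

```javascript
// Interval between MIDI Clock messages: 24 clocks per quarter note,
// so interval_ms = 60000 / (bpm * 24).
function clockIntervalMs(bpm) {
  return 60000 / (bpm * 24);
}

console.log(clockIntervalMs(120)); // ~20.83 ms, i.e. about .02 seconds
```

At any plausible tempo this is tens of milliseconds between messages, which is comfortably within what an ordinary message-send API can handle.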

-Chris
Received on Wednesday, 29 February 2012 18:10:57 GMT
