
Re: ACTION-33: MIDI charter proposal

From: James Ingram <j.ingram@netcologne.de>
Date: Thu, 01 Mar 2012 12:42:18 +0100
Message-ID: <4F4F609A.6020009@netcologne.de>
To: Anthony Bowyer-Lowe <anthony@lowbroweye.com>
CC: Chris Wilson <cwilso@google.com>, public-audio@w3.org

Hi Anthony, that was very helpful, and very quick! Thanks.

James

Am 01.03.2012 12:04, schrieb Anthony Bowyer-Lowe:
> Hi folks,
>
> As to be expected of such a long-lived protocol, MIDI actually 
> contains various mechanisms for synchronising multiple devices.
>
> A brief overview (deeper study is left to the interested reader):
>
> MMC, MIDI Machine Control - based around controlling recording 
> devices. Start, stop, pause, rewind/forward, cueing, record 
> arming/punch in/punch out, locate, eject (!). This mechanism also 
> supports querying external devices for their "identity" (response data 
> is custom per manufacturer/device).
>
> MIDI beat clock - this is the most common device synchronisation 
> method, derived from good old analogue Sync 24. It has basic Start and 
> Stop signals (there is a Continue facility too, though that is pretty 
> rare in the wild), plus a running tick sent at 24 pulses per quarter 
> note.
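>
> Since the clock rate is fixed at 24 pulses per quarter note, the 
> interval between Clock bytes follows directly from the tempo. A tiny 
> illustrative sketch (the helper name is mine, not from any spec):

```javascript
// MIDI beat clock runs at 24 pulses per quarter note (PPQN), so the
// interval between 0xF8 Clock bytes follows from the tempo alone.
// (Helper name is illustrative, not from the MIDI spec.)
function msPerClock(bpm) {
  const msPerQuarterNote = 60000 / bpm; // one beat, in milliseconds
  return msPerQuarterNote / 24;         // 24 clocks per beat
}
// e.g. at 120 bpm: 500 / 24 ≈ 20.8 ms between Clock messages
```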
>
> MTC, MIDI Time Code - derived from SMPTE synchronisation. Sent as 
> System Common messages at quarter-frame intervals while the clock is 
> running (a full-frame sysex form can also be sent while devices are 
> paused, to relocate the playback position). It notifies downstream 
> devices of the current time in hours, minutes, seconds and frames (and 
> has an internal 2-bit flag to define just which frame rate is being 
> used: 24, 25, 29.97 or 30). Can obviously be received "backwards" if 
> the controlling device is rewinding.
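>
> For the curious, a quarter-frame message is status 0xF1 plus one data 
> byte: bits 4-6 select which of the 8 pieces is being sent, and bits 
> 0-3 carry a nibble of the time. A sketch of decoding piece 7, which 
> embeds the 2-bit frame-rate code (the function name is mine):

```javascript
// An MTC quarter-frame is status 0xF1 plus one data byte: bits 4-6 say
// which of the 8 pieces this is, bits 0-3 carry a nibble of the time.
// Piece 7 (hours high nibble) also embeds the 2-bit frame-rate code.
// (Function name is illustrative.)
const MTC_RATES = [24, 25, 29.97, 30]; // frames per second
function decodeQuarterFrame(dataByte) {
  const piece = (dataByte >> 4) & 0x7;
  const nibble = dataByte & 0xf;
  if (piece === 7) {
    return { piece, hoursHighBit: nibble & 0x1, fps: MTC_RATES[(nibble >> 1) & 0x3] };
  }
  return { piece, nibble };
}
```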
>
> SPP, Song Position Pointer - like MTC, but defining the current 
> position in terms of 16th-note "MIDI beats", each of which corresponds 
> to 6 beat clocks. The top of the song is beat 0. SPP is never found in 
> isolation, always in conjunction with MIDI beat clocks, as SPP 
> messages alone are too infrequent to support strong synchronisation.
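>
> Decoding an SPP message is just a matter of combining its two 7-bit 
> data bytes into a 14-bit MIDI-beat count (sketch; the helper name is 
> mine):

```javascript
// SPP carries a 14-bit position (LSB data byte first, then MSB),
// counted in 16th-note "MIDI beats"; each MIDI beat is 6 beat clocks.
// (Helper name is illustrative.)
function sppToClocks(lsb, msb) {
  const midiBeats = (msb << 7) | lsb; // 7 significant bits per data byte
  return midiBeats * 6;               // 6 clocks per 16th note
}
// e.g. sppToClocks(4, 0) → 24 clocks, i.e. one quarter note in
```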
>
>
> Obviously, all of these protocols are subject to a certain amount of 
> clock jitter depending on the MIDI data transmission rate and 
> saturation level of the serial communication line.
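>
> For a feel for the numbers: classic DIN MIDI is a 31250-baud serial 
> line with 10 bits per byte on the wire, so each byte occupies 320 
> microseconds and a 3-byte message blocks the line for nearly a 
> millisecond:

```javascript
// Classic DIN MIDI runs at 31250 baud with 10 bits per byte on the
// wire (start + 8 data + stop), i.e. 320 µs per byte.
function wireTimeMs(byteCount) {
  const BAUD = 31250;
  const BITS_PER_BYTE = 10;
  return (byteCount * BITS_PER_BYTE * 1000) / BAUD;
}
// a 3-byte note-on occupies the line for wireTimeMs(3) = 0.96 ms
```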
>
> With regard to any MIDI API specifications, supporting just MIDI beat 
> clock would initially satisfy most device interoperation use cases.
>
> You're right that the MIDI ticks you mention, James, are just for SMF 
> parsing purposes, being an integral multiplier of a set tempo and 
> allowing each message to indicate its desired timing with a 
> tempo-independent integer. These ticks are never transmitted. Nor do 
> SMFs generally contain clocking messages; these are generated by the 
> DAW/host as needed.
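>
> For completeness, the conversion from SMF delta-time ticks to real 
> time uses the file's PPQ division together with the current tempo 
> meta-event (microseconds per quarter note). A sketch (the helper name 
> is mine):

```javascript
// SMF delta times are in ticks; the header's division (PPQ) and the
// current tempo meta-event (µs per quarter note) turn them into real
// time. (Helper name is illustrative.)
function ticksToMs(deltaTicks, usPerQuarter, ppq) {
  return (deltaTicks * usPerQuarter) / ppq / 1000;
}
// e.g. 480 ticks at 500000 µs/quarter (120 bpm), PPQ 480 → 500 ms
```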
>
> I hope some of this is helpful/comprehensible to someone!
>
>
> Cheers,
> Anthony.
>
> On 1 March 2012 10:26, James Ingram <j.ingram@netcologne.de 
> <mailto:j.ingram@netcologne.de>> wrote:
>
>     Hi Chris,
>
>
>>     cues and clock signals to set the tempo, 
>
>     First, I'd like to say that I'm an expert on music notation. I
>     program a lot, using both SVG and MIDI, but I would not class
>     myself as a real expert in either of these fields. In both cases I
>     only use the bits I really need, and when programming MIDI, I use
>     the Sanford Libraries [1] to shield me from the lowest level of
>     the Windows MIDI API. That having been said, I've been reading up
>     on "MIDI Realtime messages"...
>
>     MIDI Clock messages seem to be the *only* MIDI messages to involve
>     the terms "tempo" and "quarter-note". These abstractions only
>     really mean anything if one is eventually going to stream the file
>     to a device like a drum machine whose patterns need to be
>     synchronized with the other messages in the file. If the drum
>     machine is set to play a bass drum every "quarter-note", then it
>     will play a beat on each 24th Clock message it receives [2]. The
>     "tempo" has to be set in the sending device. (Setting the "tempo"
>     sets the frequency with which the Clock messages will be sent --
>     so your API might have a function for doing that.)
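>
>     To make that concrete: here is a sketch of what such a
>     tempo-setting function could do, sending the 0xF8 Clock byte at a
>     tempo-derived interval. The `output` object and its `send` method
>     are hypothetical - no such API exists yet.

```javascript
// A sketch of what a tempo-setting function could look like for an API
// that streams Clock messages: send 0xFA (Start), then 0xF8 (Clock) at
// the tempo-derived interval. `output.send` is hypothetical - no such
// API exists (yet), and a real implementation would need a steadier
// timer than setInterval.
function startClock(output, bpm) {
  const intervalMs = 60000 / bpm / 24; // 24 clocks per quarter note
  output.send([0xFA]);                 // MIDI Start
  return setInterval(() => output.send([0xF8]), intervalMs);
}
// ...later: clearInterval(id); output.send([0xFC]); // MIDI Stop
```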
>     But Clock messages are not the only way to store temporal
>     information in standard MIDI files. MIDI also defines Tick
>     messages (which occur once every 10 milliseconds, independently of
>     any "tempo"). Tick messages can be used to determine when to send
>     the other messages in an SMF, but it looks as if they are never
>     themselves sent to any device. Someone please correct me if I've
>     got this wrong.
>
>     Chris, I agree completely with your reservations about using the
>     MIDI term "Realtime" in this context. These are messages stored in
>     files, which are not necessarily ever sent in real time. MIDI
>     (like JavaScript) has a few historical quirks...
>     But I'm a bit wary of your use of the term "time-stamped". MIDI
>     expects most of its message types to be executed *immediately*
>     when they are received. They contain no temporal information.
>
>     So I'd like to re-phrase
>
>>     "It can also send cues, tempo and clock signals to synchronize
>>     sequencing tasks across devices, as well as encapsulate
>>     system-exclusive messages that only apply to particular brands of
>>     devices (e.g. transfer of synthesizer-specific patch or setup
>>     information)."
>     again, more simply, as follows:
>
>     "It can also send special messages for synchronizing devices, as
>     well as system-exclusive messages that only apply to particular
>     brands of device (e.g. transfer of synthesizer-specific patch or
>     setup information)."
>
>     I'd also like to change "electronic" to "digital" in the first
>     line. This is the Musical Instrument Digital Interface. There are
>     electronic musical instruments which do not support MIDI.
>
>     The complete text would then be:
>
>         MIDI Device Communication
>
>         The MIDI protocol enables digital musical instruments,
>         controllers, computers and other devices to communicate and
>         synchronize with each other.  MIDI is not just the universally
>         accepted standard for digital musical instruments, it is also
>         used for show control, lighting and special effects control.
>
>         MIDI does not transmit audio signals: instead, it sends event
>         messages about musical notes and controllers such as volume,
>         vibrato and panning. It can also send special messages for
>         synchronizing devices, as well as system-exclusive messages
>         that only apply to particular brands of device (e.g. transfer
>         of synthesizer-specific patch or setup information).
>
>         This deliverable defines an API supporting the MIDI protocol
>         and MIDI device input and output selection.  This API should
>         be defined at such a level as to enable non-music MIDI
>         applications as well as music ones.
>
>     All the best,
>     James
>
>     [1] http://www.codeproject.com/Articles/6228/C-MIDI-Toolkit
>     (Actually I use version 4, but that's a detail.)
>
>
>     [2] Chris said:
>>     There are 24 MIDI Clock messages in every quarter note.  So at
>>     120bpm, a Clock message is sent about every 20.8ms (.02 SECONDS).
>>     I would leave clock timing to the standard API as just another
>>     MIDI message - which is what I believe the Windows and Mac APIs do.
>     You are right, of course. [blush] :-)
>     j
>
>
>     Am 29.02.2012 19:10, schrieb Chris Wilson:
>>     James, I'm going to flip the order of your mail to talk about the
>>     charter proposal first, hope you don't mind.
>>
>>     cues and clock
>>     signals to set the tempo,
>>
>>     On Wed, Feb 29, 2012 at 3:27 AM, James Ingram
>>     <j.ingram@netcologne.de <mailto:j.ingram@netcologne.de>> wrote:
>>
>>         Here's a possible revision of Chris' proposed text (quoted above)
>>
>>         MIDI Device Communication
>>
>>         The MIDI protocol enables electronic musical instruments,
>>         controllers,
>>         computers and other devices to communicate and synchronize
>>         with each other.
>>         MIDI is not just the universally accepted standard for
>>         digital musical
>>         instruments, it is also used for show control, lighting and
>>         special effects
>>         control.
>>
>>         MIDI does not transmit audio signals: instead, it sends event
>>         messages about
>>         musical notes and controllers such as volume, vibrato and
>>         panning. It can
>>         also send real-time synchronization messages and
>>         system-exclusive messages
>>         for particular brands of output device.
>>
>>         This deliverable defines an API supporting the MIDI protocol
>>         and MIDI device
>>         input and output selection.  This API should be defined at
>>         such a level as
>>         to enable non-music MIDI applications as well as music ones.
>>
>>
>>     I think this wording is totally fine, except I would avoid saying
>>     "real-time synchronization" - only because real-time has specific
>>     meaning in some contexts that may not apply in MIDI (given the
>>     physical transport for MIDI).  Also, the way that sentence is
>>     structured might be construed to imply that the synchronization
>>     is manufacturer-specific, which is not true.  I would change that
>>     sentence to read: "It can also send cues, tempo and clock signals
>>     to synchronize sequencing tasks across devices, as well as
>>     encapsulate system-exclusive messages that only apply to
>>     particular brands of devices (e.g. transfer of
>>     synthesizer-specific patch or setup information)."
>>     Complete text:
>>
>>         MIDI Device Communication
>>
>>         The MIDI protocol enables electronic musical instruments,
>>         controllers, computers and other devices to communicate and
>>         synchronize with each other.  MIDI is not just the
>>         universally accepted standard for digital
>>         musical instruments, it is also used for show control,
>>         lighting and special effects control.
>>
>>         MIDI does not transmit audio signals: instead, it sends event
>>         messages about musical notes and controllers such as volume,
>>         vibrato and panning. It can also send cues, tempo and clock
>>         signals to synchronize sequencing tasks across devices, as
>>         well as encapsulate system-exclusive messages that only apply
>>         to particular brands of devices (e.g. transfer of
>>         synthesizer-specific patch or setup information).
>>
>>         This deliverable defines an API supporting the MIDI protocol
>>         and MIDI device input and output selection.  This API should
>>         be defined at such a level as to enable non-music MIDI
>>         applications as well as music ones.
>>
>>
>>     James also wrote:
>>
>>         1. Input and Output Devices
>>         These are essential. Implementing them would require a small
>>         amount of code, and enable a huge amount of functionality.
>>
>>
>>     +1.
>>
>>         2. JavaScript Synthesizers
>>         Presumably these define their own APIs (which need not be
>>         MIDI). If a JavaScript synthesizer *is* a MIDI synthesizer,
>>         it should probably be allowed to say whether it wants to be
>>         included in the MIDI Output Devices enumeration or not. Some
>>         Javascript synthesizers will be controlled directly by other
>>         Javascript in the same file, and send their output directly
>>         to the sound system. Such synthesizers would not need to go
>>         via the MIDI API.
>>
>>
>>     The more I've thought about this, the more I'm inclined to say
>>     this is a v2 extension of the API - the implications of
>>     communicating via virtual MIDI ports to other running web
>>     applications are a little concerning.  Not that I think it's
>>     impossible, or unattractive; just that I think it's a more
>>     advanced use case, and I'd like to walk before we try to fly.
>>
>>         3. Standard MIDI Files
>>         There should be a standard way to convert an SMF into a queue
>>         of MIDI messages waiting to be sent. Playing an SMF means
>>         sending the pre-composed messages at the times they define.
>>         (Is there a potential memory problem here with large files?)
>>         The "standard way" could be JavaScript -- possibly supplied
>>         by the MIDI Manufacturer's Association -- or it could be code
>>         inside the browsers or operating systems themselves. That's
>>         something which would need discussing.
>>
>>
>>     Exploring this need was why I whipped up a quick Javascript SMF
>>     reader a couple of weeks ago ().  Once we have an API, I'm happy
>>     to write the rest of the code that this needs to have it prepped
>>     as timestamped MIDI events, and release it as a library; as I've
>>     grown convinced most use cases will want direct control over
>>     SOMETHING during playback, I think releasing this as an open
>>     source library rather than encoding it as a black-box API is
>>     probably best.
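>>
>>     To illustrate the "timestamped MIDI events" idea: one simple
>>     representation is a queue of { timeMs, bytes } pairs drained
>>     against a clock. This is purely illustrative - nothing like it
>>     is specified anywhere.

```javascript
// One illustrative shape for a "queue of timestamped MIDI messages":
// entries of { timeMs, bytes } in ascending time order, drained against
// a clock. Nothing like this is specified anywhere - it is just one way
// the idea could look.
function playQueue(queue, sendFn, now = () => Date.now()) {
  const t0 = now();
  (function pump() {
    while (queue.length && queue[0].timeMs <= now() - t0) {
      sendFn(queue.shift().bytes);
    }
    if (queue.length) setTimeout(pump, 1); // check again shortly
  })();
}
```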
>>
>>         4. Synthesizer-specific files
>>         As I understand it, these are simply Standard MIDI Files
>>         containing "System Exclusive" MIDI commands. So they don't
>>         need any special treatment in a W3C MIDI API. A "System
>>         Exclusive" message would be sent in the same way as any other
>>         MIDI message.
>>
>>
>>     Yes, that's correct.  (Well, actually, SYSEX messages are sent
>>     slightly differently, but that's all part of the MIDI spec - and
>>     there's no "system-exclusive FILE format", it's just a standard
>>     MIDI file containing sysex messages.  IIUC, anyway.)
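>>
>>     For reference, a sysex message is simply framed by 0xF0 ... 0xF7,
>>     with a manufacturer (or universal) ID and 7-bit data bytes in
>>     between. The universal identity request defined in the MIDI spec
>>     is a concrete example:

```javascript
// A sysex message is framed by 0xF0 ... 0xF7, with a manufacturer (or
// universal) ID and 7-bit data bytes in between. The MIDI spec's
// universal identity request is a concrete example:
const identityRequest = [
  0xF0,       // sysex start
  0x7E,       // universal non-realtime ID
  0x7F,       // device ID: "all call"
  0x06, 0x01, // general information: identity request
  0xF7        // sysex end
];
```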
>>
>>         5. MIDI Realtime messages
>>         MIDI devices are usually synchronized by allocating channels
>>         to them, and simply sending synchronous, channel-specific
>>         messages.
>>         But MIDI Realtime messages can also be used to synchronize
>>         output devices. These include the MIDI Clock message, which
>>         is usually sent repeatedly (if at all) at short time
>>         intervals (of the order of .02 milliseconds) . To keep memory
>>         requirements down, it would be better if MIDI Realtime
>>         messages were both generated and sent while the main message
>>         queue is being sent. Possibly in a different thread.
>>         So the API needs a separate interface to these:
>>         1. Should Realtime messages be sent at all? (no, by default)
>>         2. Which Realtime messages should be sent, and when?
>>         3. etc.
>>
>>
>>     There are 24 MIDI Clock messages in every quarter note.  So at
>>     120bpm, a Clock message is sent about every 20.8ms (.02 SECONDS).
>>     I would leave clock timing to the standard API as just another
>>     MIDI message - which is what I believe the Windows and Mac APIs do.
>     You are right, of course. [blush] :-)
>>     -Chris
>>
>
>     James
>
>     -- 
>     www.james-ingram-act-two.de  <http://www.james-ingram-act-two.de>
>
>


-- 
www.james-ingram-act-two.de
Received on Thursday, 1 March 2012 11:43:01 GMT
