
Re: MIDI files and streams

From: Phil Burk <philburk@mobileer.com>
Date: Fri, 02 Mar 2012 09:27:22 -0800
Message-ID: <4F5102FA.5080009@mobileer.com>
To: public-audio@w3.org
On 3/2/12 4:27 AM, James Ingram wrote:
> So I'll stick to my guns, and repeat that MIDI commands (in the MIDI
> protocol) don't use time stamps.

The MIDI protocol itself does not have timestamps for messages. But 
timestamps have proven to be a very useful part of MIDI APIs. Without 
them, the timing of rendered MIDI is often very ragged.

Capturing an accurate performance requires timestamping of incoming 
events. Playback of a performance, or the rendering of a generated 
performance (e.g. drum boxes), both benefit greatly from timestamping.

By using timestamps, a MIDI rendering program can wake up at leisurely 
intervals, 10-30 times per second. If it had to output its MIDI at the 
exact moment it was to be played then it would have to wake up hundreds 
of times per second with no scheduling jitter. The underlying synth can 
also process audio in batches, applying the MIDI messages at the 
appropriate point in the sample stream based on the timestamp.
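As a sketch of that wake-at-leisure pattern (the `sendMIDI` output call and the event-queue shape here are hypothetical, not part of any existing API):

```javascript
// Wake ~20 times per second and schedule events ahead of time, instead
// of waking at the exact moment each event must sound. sendMIDI(bytes,
// timestamp) is a hypothetical output call taking an absolute timestamp
// in milliseconds; the underlying synth applies each message at that
// point in its sample stream.
const WAKE_INTERVAL_MS = 50;   // wake 20 times per second
const LOOKAHEAD_MS = 100;      // schedule events up to 100 ms ahead

function pump(queue, sendMIDI, now) {
  // queue is assumed sorted by time. Emit every event due within the
  // lookahead window, tagged with the exact time it should sound.
  while (queue.length > 0 && queue[0].time < now + LOOKAHEAD_MS) {
    const ev = queue.shift();
    sendMIDI(ev.bytes, ev.time);
  }
}
```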

Timestamping has proven to be so useful in MIDI that the MMA is making 
timestamps part of the protocol in the new "HD Music" standard.

Note that using timestamps requires being able to query the current 
time. JavaSound supports timestamps on output, but the time query did 
not work (at least in early versions), so a programmer had no idea what 
value to use for the timestamps.

For a MIDI rendering program, such as an SMF player, or a drum box, a 
program generally schedules events some time in the near future. The 
minimum advance time must be greater than the maximum scheduling jitter 
in order to smooth out the timing.
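That rule can be stated as a one-line computation; the jitter and margin figures below are purely illustrative, not measurements from any particular platform:

```javascript
// Earliest safe timestamp for a newly scheduled event: far enough ahead
// that worst-case wakeup jitter cannot make the event arrive late.
const MAX_JITTER_MS = 30;  // assumed worst-case scheduling jitter
const SAFETY_MS = 5;       // extra margin

function earliestSafeTime(now) {
  return now + MAX_JITTER_MS + SAFETY_MS;
}
```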

For software synths, it is possible to achieve accurate rendering of an 
SMF file without timestamps by processing an in-memory image of the 
file during the audio rendering process. One can generate 64 frames of 
audio, then advance through the SMF data to get more MIDI events, then 
generate more audio, and so on. This is how our company implemented an 
SMF ringtone player for mobile phones.
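A minimal sketch of that interleaved loop, with a hypothetical synth interface (`handleMIDI`, `renderBlock`) and a pre-parsed event list standing in for the in-memory SMF image:

```javascript
// Render audio in 64-frame blocks, applying any MIDI events that fall
// inside each block before generating it. Event ordering gives
// sample-accurate placement at block granularity without timestamps.
const BLOCK_FRAMES = 64;

function renderSong(events, synth, totalFrames, sampleRate) {
  let frame = 0;
  let next = 0; // index of the next unplayed event (sorted by time)
  while (frame < totalFrames) {
    // Apply every event whose time falls within this block.
    while (next < events.length &&
           events[next].timeSec * sampleRate < frame + BLOCK_FRAMES) {
      synth.handleMIDI(events[next].bytes);
      next++;
    }
    synth.renderBlock(BLOCK_FRAMES); // generate 64 frames of audio
    frame += BLOCK_FRAMES;
  }
}
```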

> SMFs use the methods outlined by Anthony
> yesterday [2]. These generally consist of projecting a series of nominally
> regular ticks (of some sort) onto the other messages in the file.

SMF files contain a timestamp before each MIDI message, encoded as the 
duration since the previous event in the track. Those timestamps are 
part of SMF and not part of the MIDI protocol. SMF does not use MIDI 
Time Code, MIDI Clock, or other real-time MIDI timing messages to 
describe time.
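Those per-event delta times are stored as variable-length quantities: seven bits per byte, with the high bit set on every byte except the last. A small decoder:

```javascript
// Decode one SMF variable-length delta time starting at `offset`.
// Returns the decoded value and the offset just past it.
function readDeltaTime(bytes, offset) {
  let value = 0;
  let i = offset;
  let b;
  do {
    b = bytes[i++];
    value = (value << 7) | (b & 0x7f); // low 7 bits carry data
  } while (b & 0x80);                  // high bit set = more bytes follow
  return { value, nextOffset: i };
}
```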

> Apropos age: I'm not very interested in the Standard MIDI
> File format.

SMF with General MIDI is still a good format for song interchange. There 
are better proprietary formats but SMF is still widely used. And there 
were a huge number of SMF files created when SMF polyphonic ringtones 
were popular. SMF is a good choice if you want to transmit a short song 
in a few KB. But SMF does not have to be supported in the core web MIDI 
API. As long as the API supports timestamps, it would be possible to 
write a good quality SMF player in JavaScript.

Phil Burk
Received on Friday, 2 March 2012 17:27:56 UTC
