
Re: MIDI files and streams

From: Anthony Bowyer-Lowe <anthony@lowbroweye.com>
Date: Fri, 2 Mar 2012 17:45:43 +0000
Message-ID: <CAMCSOPXicsDQF6m=HMpcHcgubq5jsDmYAy99vGt3tV_xNmfHpw@mail.gmail.com>
To: philburk@mobileer.com
Cc: public-audio@w3.org
On 2 March 2012 17:27, Phil Burk <philburk@mobileer.com> wrote:

> The MIDI protocol itself does not have timestamps for messages. But
> timestamps have proven to be a very useful part of MIDI APIs. Without them,
> the timing of rendered MIDI is often very ragged.
> Capturing an accurate performance requires timestamping of incoming
> events. Playback of a performance, or the rendering of a generated
> performance (e.g. drum boxes), both benefit greatly from timestamping.
> By using timestamps, a MIDI rendering program can wake up at leisurely
> intervals, 10-30 times per second. If it had to output its MIDI at the
> exact moment it was to be played then it would have to wake up hundreds of
> times per second with no scheduling jitter. The underlying synth can also
> process audio in batches, applying the MIDI messages at the appropriate
> point in the sample stream based on the timestamp.

Absolutely. Accurate, high-resolution event timestamping and scheduling is
a must-have for any professional audio API, and it also simplifies and
optimises sequencer logic.

If the output API were in "immediate" mode, integrated setups would
exhibit even worse tempo jitter artefacts than usual, due to JavaScript
execution time and driver latency in conjunction with MIDI's serial nature.
Quantised chords become strums and tight drum beats become sloppy flams:
effects best applied explicitly, when needed.
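To make the lookahead idea from Phil's message concrete, here is a minimal sketch of a timestamp-based scheduler: the app wakes only ~20 times per second and hands messages to the output ahead of time with their exact timestamps, so timing accuracy comes from the device/driver rather than from JavaScript wake-ups. The `output.send(data, timestamp)` shape mirrors the Web MIDI API's `MIDIOutput.send`, but the event data, queue layout, and helper names here are illustrative assumptions, not a proposed API.

```javascript
const LOOKAHEAD_MS = 50; // schedule everything due within the next 50 ms

// A queued performance: MIDI messages with absolute timestamps (ms).
const queue = [
  { time: 10,  data: [0x90, 60, 0x7f] },  // C4 note-on
  { time: 10,  data: [0x90, 64, 0x7f] },  // E4 note-on - same instant:
  { time: 10,  data: [0x90, 67, 0x7f] },  // G4 note-on - a chord, not a strum
  { time: 250, data: [0x80, 60, 0x40] },  // C4 note-off
];

// Stand-in for a MIDIOutput: records (data, timestamp) pairs.
const sent = [];
const output = { send: (data, timestamp) => sent.push({ data, timestamp }) };

// One scheduler tick: emit every event due before now + LOOKAHEAD_MS,
// passing its exact timestamp so the output renders it at the right moment.
function tick(now) {
  while (queue.length > 0 && queue[0].time < now + LOOKAHEAD_MS) {
    const ev = queue.shift();
    output.send(ev.data, ev.time);
  }
}

// Simulate waking 20 times per second (every 50 ms) over 300 ms.
for (let now = 0; now <= 300; now += 50) tick(now);
```

Note that the three chord messages keep an identical timestamp even though they leave the serial MIDI stream one after another; with an immediate-mode API, each would instead be stamped with its (jittery) send time.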

Received on Friday, 2 March 2012 17:46:26 UTC
