Re: Adding MIDI APIs to Audio WG Charter (was: MIDI enumeration (was: Re: getUserMedia use cases))

On Mon, Feb 6, 2012 at 10:12 PM, Chris Wilson <cwilso@google.com> wrote:

> After some brief discussion a month or so ago with Robert in this list, I
> can see some value in being able to have a "MIDI stream to audio stream"
> processor; I see this as a very similar use case to the software synth
> plugins offered for Steinberg's VST, Digidesign's RTAS, Apple's Audio
> Units, et al.


I agree; that's where the idea of the sampleTime attribute comes from.
Virtual instrument APIs usually couple MIDI messages with audio buffers,
timestamping each message with its position in the buffer, expressed in
samples.
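
To make that pattern concrete, here is a minimal sketch of buffer-relative
timestamping in the spirit of a VST-style process callback. The event shape
({data, sampleTime}) follows the attribute discussed above, but the synth
interface (render, handleMIDI) is invented purely for illustration and is
not part of any proposal quoted here:

function processBuffer(synth, output, events) {
  // "events" holds the MIDI messages for this buffer, each carrying a
  // sampleTime: an offset in samples into "output", sorted ascending.
  var cursor = 0;
  for (var i = 0; i < events.length; i++) {
    var event = events[i];
    // Render audio up to the sample where the message lands...
    synth.render(output, cursor, event.sampleTime);
    // ...then apply the message at its sample-accurate position.
    synth.handleMIDI(event.data);
    cursor = event.sampleTime;
  }
  // Render the remainder of the buffer after the last message.
  synth.render(output, cursor, output.length);
}

This is essentially how VST/RTAS/Audio Units hosts deliver sample-accurate
MIDI to their plugins: split the render at each event boundary.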


> However, in the other use cases (capturing MIDI input as controllers to
> drive a standalone synth program, similar to the MIDI software synths I use
> on my iPad; building a software MIDI sequencer in Javascript to drive
> hardware MIDI devices) that seems like a terrible burden, and it would be
> more useful to have a low-level system API that is more akin to what's
> available on Windows, OSX, iOS and Java (and similar to PortMIDI - thanks
> for the pointer, Vilson - although it has a confusing concept of "stream",
> to me).  I'd much rather define the low-level infrastructure, which should
> be both familiar to any MIDI developer and straightforward to implement on
> top of the aforementioned platforms, and then define a stream interface if
> there's enough demand for a single VST-synth-plugin-like scenario.
>

I believe my proposal would offer the same low-level functionality as
PortMIDI does (and more), as well as integration with the audio APIs we're
working on.
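
For comparison, a rough sketch of what such low-level access could look like
from script. All of the names here (getMIDIAccess, inputs/outputs, onmessage,
send) are hypothetical stand-ins rather than the actual proposal, but they map
onto PortMIDI's Pm_OpenInput/Pm_OpenOutput, Pm_Read and Pm_Write:

// Hypothetical names throughout; this only illustrates the shape of a
// PortMIDI-like low-level API exposed to JavaScript.
navigator.getMIDIAccess(function (midi) {
  var input = midi.inputs[0];   // cf. Pm_OpenInput
  var output = midi.outputs[0]; // cf. Pm_OpenOutput

  // Receive raw messages from a hardware controller (cf. Pm_Read).
  input.onmessage = function (event) {
    console.log(event.data, event.timestamp);
  };

  // Drive a hardware synth: note-on, channel 1, middle C,
  // velocity 100 (cf. Pm_Write).
  output.send([0x90, 60, 100]);
});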

Received on Monday, 6 February 2012 20:25:49 UTC