
Re: Adding MIDI APIs to Audio WG Charter (was: MIDI enumeration (was: Re: getUserMedia use cases))

From: Jussi Kalliokoski <jussi.kalliokoski@gmail.com>
Date: Mon, 6 Feb 2012 17:19:47 +0200
Message-ID: <CAJhzemW4EuQQcjgVa410h8HjGjrspR+84+A=YgR0wMw=f2Tw2w@mail.gmail.com>
To: public-audio@w3.org

I think integration with media streams would be highly useful. For example,
the Media Streams Processing API could expose the incoming MIDI data as a
time-sorted array, timestamped to sample positions in the current audio
buffer (possibly as float values for higher resolution), and you could also
pass the data on, modified, just as you can with audio.

But Media Streams are definitely not the only use case for MIDI, so it
should also be made available outside Media Streams. I don't know how all
of this would come together, especially once you bring virtual devices into
the picture. Needs some thought.

As for GM, I think listing all MIDI output devices will suffice; the end
user can then decide which to use. Even some Windows systems lack GM, for
example when there is no soundcard (I remember having a Windows
configuration without GM at some point). On Linux, TiMidity, FluidSynth and
similar aren't usually installed by default, and in some cases it can be
tedious to detect whether and which of them are installed. On OS X, you
have better alternatives to GM anyway.

Jussi Kalliokoski
Official.fm Labs - Team Lead
Received on Monday, 6 February 2012 15:20:37 GMT