
Re: Adding MIDI APIs to Audio WG Charter (was: MIDI enumeration (was: Re: getUserMedia use cases))

From: James Ingram <j.ingram@netcologne.de>
Date: Mon, 06 Feb 2012 18:41:40 +0100
Message-ID: <4F3010D4.4040700@netcologne.de>
To: public-audio@w3.org
CC: Joseph Berkovitz <joe@noteflight.com>, Jussi Kalliokoski <jussi.kalliokoski@gmail.com>
Hi Jussi, Joe,

I don't yet have much experience using Javascript, but


looks like a good start from where I'm sitting. :-)

Something I don't understand / that needs thinking about: how, as a Javascript author, do
I get access to MIDI information being generated live by a MIDI _input_ device? It would
be wonderful if that could be arranged.
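To make the question concrete, here is a sketch of what receiving live input might look like. The wiring at the bottom assumes a hypothetical browser API shaped like `navigator.requestMIDIAccess` (nothing of the sort is specified yet); the decode function itself is plain JavaScript and just follows the MIDI 1.0 channel-voice message layout.

```javascript
// Decode a raw 3-byte channel-voice MIDI message into a plain object.
function decodeMidiMessage(bytes) {
  const status = bytes[0];
  const kind = status & 0xf0;    // high nibble: message type
  const channel = status & 0x0f; // low nibble: MIDI channel (0-15)
  if (kind === 0x90 && bytes[2] > 0) {
    return { type: "noteon", channel, note: bytes[1], velocity: bytes[2] };
  }
  if (kind === 0x80 || (kind === 0x90 && bytes[2] === 0)) {
    // A note-on with velocity 0 is conventionally treated as note-off.
    return { type: "noteoff", channel, note: bytes[1], velocity: bytes[2] };
  }
  return { type: "other", channel, status };
}

// Hypothetical wiring to an input-device API (names are assumptions):
// navigator.requestMIDIAccess().then(access => {
//   for (const input of access.inputs.values()) {
//     input.onmidimessage = e => console.log(decodeMidiMessage(e.data));
//   }
// });
```

Whatever shape the device-enumeration API takes, a per-message callback like the one commented out above would be enough for live input.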

Joe, I understood Marat to be saying that JS-based wavetable synths _are_ a distinct
possibility. Presumably, browsers should use them wherever they are provided, rather than
the attached devices. But, as I said, Javascript is not my strong point. I take your
word for it that they work! :-)
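For what it's worth, the core of such a synth is small. Below is a minimal sketch of wavetable-lookup synthesis in plain JavaScript (the function names are mine, not from any proposal): one cycle of a waveform is stored in a table, and playback reads through it at a rate set by the desired frequency, with linear interpolation between table entries.

```javascript
// Build a wavetable holding one cycle of a sine wave.
function makeWavetable(size) {
  const table = new Float32Array(size);
  for (let i = 0; i < size; i++) {
    table[i] = Math.sin((2 * Math.PI * i) / size);
  }
  return table;
}

// Render numSamples of a tone at freq Hz by stepping through the table.
function renderNote(table, freq, sampleRate, numSamples) {
  const out = new Float32Array(numSamples);
  let phase = 0;                                    // position in the table
  const step = (table.length * freq) / sampleRate;  // table indices per sample
  for (let n = 0; n < numSamples; n++) {
    const i = Math.floor(phase);
    const frac = phase - i;
    const a = table[i];
    const b = table[(i + 1) % table.length];
    out[n] = a + frac * (b - a);                    // linear interpolation
    phase = (phase + step) % table.length;
  }
  return out;
}
```

A browser would still need a way to get the resulting samples to the audio hardware, but the synthesis itself is clearly feasible in Javascript.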

Perhaps it's also worth noting that users may have post-production software such as Cubase
or Ableton installed, and that these can also be made accessible via devices which appear
in the user's MIDI output devices list.


Received on Monday, 6 February 2012 17:44:49 UTC

This archive was generated by hypermail 2.3.1 : Tuesday, 6 January 2015 21:49:57 UTC