
Re: Adding MIDI APIs to Audio WG Charter (was: MIDI enumeration (was: Re: getUserMedia use cases))

From: Jussi Kalliokoski <jussi.kalliokoski@gmail.com>
Date: Mon, 6 Feb 2012 21:15:17 +0200
Message-ID: <CAJhzemXMUW6MkCJ1N=pqpNbE9Od9ynkbqYismLe-=c0AtoCNKA@mail.gmail.com>
To: James Ingram <j.ingram@netcologne.de>
Cc: public-audio@w3.org, Joseph Berkovitz <joe@noteflight.com>

I added two more examples, one for reading NoteOn and NoteOff events and
logging the note names for them, and another for creating a virtual MIDI
output with a simple JS sequencer to feed it data.
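
In case the gist moves, here's a rough sketch of the pure logic those two
examples build on. To keep it self-contained I've left out the device/event
plumbing entirely (that part of the proposed API is still in flux); the
`decodeNote` and `sequence` helpers below are my own illustrative names, and
only the byte layout (0x9n NoteOn / 0x8n NoteOff, NoteOn with velocity 0
treated as NoteOff) follows the MIDI 1.0 spec.

```javascript
// Note names for the 12 semitones of an octave.
const NOTE_NAMES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"];

// MIDI note number -> name with octave, e.g. 60 -> "C4", 69 -> "A4".
function noteName(n) {
  return NOTE_NAMES[n % 12] + (Math.floor(n / 12) - 1);
}

// Decode a 3-byte channel voice message into a loggable object.
// NoteOn with velocity 0 is treated as NoteOff, per the MIDI 1.0 spec.
// Returns null for anything that isn't a note message.
function decodeNote(bytes) {
  const status = bytes[0] & 0xF0;
  const note = bytes[1];
  const velocity = bytes[2];
  if (status === 0x90 && velocity > 0) {
    return { type: "noteon", name: noteName(note), velocity: velocity };
  }
  if (status === 0x80 || status === 0x90) {
    return { type: "noteoff", name: noteName(note) };
  }
  return null;
}

// Toy sequencer: expand { beat, note, duration } events into timestamped
// MIDI byte messages, ready to feed to whatever output object the API
// ends up exposing. Tempo is in beats per minute; times are milliseconds.
function sequence(notes, tempoBpm) {
  const msPerBeat = 60000 / tempoBpm;
  const messages = [];
  for (const ev of notes) {
    const channel = ev.channel || 0;
    messages.push({ time: ev.beat * msPerBeat, bytes: [0x90 | channel, ev.note, 100] });
    messages.push({ time: (ev.beat + ev.duration) * msPerBeat, bytes: [0x80 | channel, ev.note, 0] });
  }
  return messages.sort(function (a, b) { return a.time - b.time; });
}
```

So logging note names is just `console.log(decodeNote([0x90, 60, 100]).name)`,
and the sequencer example is `sequence(...)` plus a timer loop that sends each
message at its timestamp.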

On Mon, Feb 6, 2012 at 7:41 PM, James Ingram <j.ingram@netcologne.de> wrote:

> Hi Jussi, Joe,
>
> I don't yet have much experience using Javascript, but
>
> https://gist.github.com/1752949
>
> looks like a good start from where I'm sitting. :-)
>
> Something I don't understand / that needs thinking about: How, as a
> Javascript author, do
> I get access to MIDI information being generated live by a MIDI _input_
> device? Would be
> wonderful if that could be arranged.
>
> Joe, I understood Marat to be saying that JS-based wavetable synths _are_
> a distinct
> possibility. Presumably, browsers should use them wherever they are
> provided, rather than
> the attached devices. But, as I said, Javascript is not my strong point. I
> take your
> word for it that they work! :-)
>
> Perhaps it's also worth noting that users may have post-production
> software such as Cubase
> or Ableton installed, and that these can also be made accessible via
> devices which appear
> in the user's MIDI output devices list.
>
> best
> James
>
> --
> www.james-ingram-act-two.de
>
>
>
Received on Monday, 6 February 2012 19:15:45 GMT

This archive was generated by hypermail 2.2.0+W3C-0.50 : Monday, 6 February 2012 19:15:45 GMT