Fwd: MIDI in Web Audio

Sent from my iPhone

Begin forwarded message:

*From:* Grant Galitz <grantgalitz@gmail.com>
*Date:* April 22, 2011 5:35:23 PM EDT
*To:* Alistair Macdonald <al@bocoup.com>
*Subject:* *Re: MIDI in Web Audio*

Even better, just implement MIDI in a JS lib that uses mozAudio and Web
Audio's JavaScriptNode. Synthesize the audio in JS and implement the server
in JS as well. No reason it can't be done. I implemented the Game Boy's audio
chip in JS and was able to do it all in the browser without external
transfers or plugins, so it's definitely feasible to build a fully MIDI-compliant
audio server in JS.
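A minimal sketch of the idea above, assuming the 2011-era JavaScriptNode callback model; `midiNoteToFreq` and `renderSquare` are illustrative names, not an existing API:

```javascript
// Convert a MIDI note number to its frequency in Hz (A4 = note 69 = 440 Hz).
function midiNoteToFreq(note) {
  return 440 * Math.pow(2, (note - 69) / 12);
}

// Fill a sample buffer with a naive square wave, the way a
// JavaScriptNode's onaudioprocess callback would on each block.
function renderSquare(buffer, freq, sampleRate, phase) {
  for (var i = 0; i < buffer.length; i++) {
    buffer[i] = (phase % 1) < 0.5 ? 0.25 : -0.25;
    phase += freq / sampleRate;
  }
  return phase; // carry phase into the next callback
}

// In a browser this might be wired up roughly like:
//   var node = context.createJavaScriptNode(2048, 0, 1);
//   node.onaudioprocess = function (e) {
//     phase = renderSquare(e.outputBuffer.getChannelData(0),
//                          midiNoteToFreq(currentNote),
//                          context.sampleRate, phase);
//   };
//   node.connect(context.destination);
```

The pure functions carry the synthesis; only the last few commented lines depend on the browser's audio API.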


On Apr 22, 2011, at 5:04 PM, Alistair Macdonald <al@bocoup.com> wrote:

So MIDI messages go over HTTP? Isn't latency pretty bad that way?

I have done this before:

USB MIDI > JAVA > Apache & PHP > JavaScript

I was using it to turn CC dials into controls for a 3D landscape. The latency
was very good locally (sub-millisecond), but very inconsistent across the
internet (as expected): sometimes quick, but usually too slow to be
called "real-time".

Obviously HTTP MIDI is no kind of long-term solution, and as Chris Rogers
points out, the device spec is probably the best way forward. It would also
be good to see someone develop MIDI-based browser plugins to test how such
things might work in the future: device recognition,
master-controller assignment, channel routing, etc.

Even so, Olli, I wanted you to be aware that a local HTTP-based MIDI
interface was in fact fast enough for a real-time music performance. The
latency was typically around a millisecond, which is actually 32 times
faster than my software synthesizer can turn my MIDI data into sound. :)

-- Alistair

On Fri, Apr 22, 2011 at 3:12 PM, Vilson Vieira <vilson@void.cc> wrote:

> Hi Olli and Chris,
> 2011/4/22 Chris Rogers <crogers@google.com>:
>> Olli,
>> I think people are just experimenting with getting MIDI events into the
>> browser any way they can right now.  If we add browser support for it, then
>> of course we can bypass the HTTP latency.
> yes, that's the point. We have latency over HTTP, but IMHO for now it is the
> best we can do.
>> The MIDI events from the OS will come in on their own thread, which can
>> then be dispatched to a JS event listener.  Of course, this is assuming that
>> a proposal for a MIDI JavaScript API is crafted and implemented.
> I'd like to help with this. I'm wondering whether it could be related to the work
> of the Device APIs and Policy WG [1] or the Webinos Discovery Plugin [2].
> Cheers.
> PS: I forgot to include the reference to osc-web, so here it is:
> http://automata.cc/osc-web
> [1] http://www.w3.org/2009/dap/
> [2] http://www.w3.org/2011/04/discovery.html
> --
> Vilson Vieira
> vilson@void.cc
> ((( http://automata.cc )))
> ((( http://musa.cc )))

Received on Friday, 22 April 2011 21:36:07 UTC