
Re: Adding MIDI APIs to Audio WG Charter (was: MIDI enumeration (was: Re: getUserMedia use cases))

From: Jussi Kalliokoski <jussi.kalliokoski@gmail.com>
Date: Mon, 6 Feb 2012 22:28:19 +0200
Message-ID: <CAJhzemUfAPwD5004yGU0qXmEacvCBAi_R+zRrya1ck=r8T0MvA@mail.gmail.com>
To: Joseph Berkovitz <joe@noteflight.com>
Cc: Chris Wilson <cwilso@google.com>, Vilson Vieira <vilson@void.cc>, robert@ocallahan.org, James Ingram <j.ingram@netcologne.de>, public-audio@w3.org
On Mon, Feb 6, 2012 at 10:23 PM, Joseph Berkovitz <joe@noteflight.com> wrote:

> +1 on this approach. Let's get timestamped MIDI events in and out of the
> system as a primary goal.
>
> I think Jussi's proposal has the right kind of flavor. I didn't get why
> setInterval() was needed if the outgoing messages can have timestamps, but
> that's a detail.
>

I used setInterval() to simplify the example a bit, but the idea is that
queueing timestamped events would also be possible. :)
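A minimal sketch of what queueing timestamped events might look like (the MIDIEventQueue class and the send() method here are purely illustrative scaffolding, not part of any proposed API):

```javascript
// Hypothetical queue that holds MIDI messages until their delivery time.
// "output" is anything with a send(data) method; timestamps are absolute ms.
class MIDIEventQueue {
  constructor(output) {
    this.output = output;
    this.events = [];   // [{ data, timestamp }], kept sorted by timestamp
  }

  // Queue a raw MIDI message for delivery at an absolute time.
  queue(data, timestamp) {
    this.events.push({ data, timestamp });
    this.events.sort((a, b) => a.timestamp - b.timestamp);
  }

  // Deliver every event whose timestamp has passed.
  flush(now) {
    while (this.events.length && this.events[0].timestamp <= now) {
      this.output.send(this.events.shift().data);
    }
  }
}

// Example: a note-on (0x90) at t=100 ms and its note-off (0x80) at t=600 ms.
const sent = [];
const q = new MIDIEventQueue({ send: (d) => sent.push(d) });
q.queue([0x90, 60, 100], 100);
q.queue([0x80, 60, 0], 600);
q.flush(100);  // delivers only the note-on
q.flush(600);  // delivers the note-off
```

In a real implementation the flush would of course be driven by the audio clock rather than by the caller, but the queueing model is the same.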


> ...joe
>
> On Feb 6, 2012, at 3:12 PM, Chris Wilson wrote:
>
> After some brief discussion a month or so ago with Robert in this list, I
> can see some value in being able to have a "MIDI stream to audio stream"
> processor; I see this as a very similar use case to the software synth
> plugins offered for Steinberg's VST, Digidesign's RTAS, Apple's Audio
> Units, et al.
>
> However, for the other use cases (capturing MIDI input as controllers to
> drive a standalone synth program, similar to the MIDI software synths I use
> on my iPad, or building a software MIDI sequencer in JavaScript to drive
> hardware MIDI devices), that seems like a terrible burden, and it would be
> more useful to have a low-level system API akin to what's available on
> Windows, OS X, iOS and Java (and similar to PortMIDI - thanks for the
> pointer, Vilson - although its concept of a "stream" is confusing to me).
> I'd much rather define the low-level infrastructure, which should be both
> familiar to any MIDI developer and straightforward to implement on top of
> the aforementioned platforms, and then define a stream interface if there's
> enough demand for a single VST-synth-plugin-like scenario.
>
> On Mon, Feb 6, 2012 at 11:48 AM, Jussi Kalliokoski <
> jussi.kalliokoski@gmail.com> wrote:
>
>> Hi Vilson!
>>
>> In the context of a browser, I think it's more useful to have a
>> higher-level API that integrates well with the existing audio and video
>> APIs, to allow better latency synchronization, etc.
>>
>> Cheers,
>> Jussi
>>
>>
>> On Mon, Feb 6, 2012 at 9:43 PM, Vilson Vieira <vilson@void.cc> wrote:
>>
>>> Hi all,
>>>
>>> when I think of MIDI devices as input in the browser, I just think of a
>>> wrapper API around a MIDI lib like PortMidi. Am I wrong?
>>>
>>> Cheers.
>>>
>>>
>>> 2012/2/6 Robert O'Callahan <robert@ocallahan.org>
>>>
>>>> On Tue, Feb 7, 2012 at 5:28 AM, Jussi Kalliokoski <
>>>> jussi.kalliokoski@gmail.com> wrote:
>>>>
>>>>> I put together a gist in the form of IDL of what MIDI in the browser
>>>>> could look like, in the spirit of the MediaStreams Processing API and
>>>>> getUserMedia API. Excuse my weak IDL skills; I hope it suffices to
>>>>> introduce my idea. It's also a bit incomplete: for example, it doesn't
>>>>> describe how to actually push the MIDI stream to the
>>>>> MediaStreamProcessor, because I haven't thought of a good way to do it
>>>>> yet. I also included an example usage.
>>>>>
>>>>> https://gist.github.com/1752949
>>>>>
>>>>> Feedback appreciated.
>>>>>
>>>>
>>>> To make sure you can get MIDI data for all input streams to a
>>>> ProcessedMediaStream, I'd give each MediaInputBuffer an array of MIDIEvent
>>>> objects comprising the MIDI data for the first MIDI track, and give
>>>> ProcessMediaEvent a writeMIDI() method that lets you write an array of
>>>> MIDIEvents to the stream output. Maybe it's necessary to support multiple
>>>> output MIDI tracks for a single stream?
>>>>
>>>> Also, ProcessMediaEvent already has an 'inputTime', which probably
>>>> means the same thing as "sampleTime".
>>>>
>>>> We also need to discuss whether main-thread access to MIDI data streams
>>>> is worth having. Most of the time you want the latency and performance
>>>> benefits of handling the data in a Worker, and it's trivial to have a
>>>> worker send and receive data to the main thread if needed, so a dedicated
>>>> main-thread API may not be needed.
>>>>
>>>> There's a whole separate conversation to be had about how devices
>>>> should be accessed and enumerated and how the UI could work. This is mostly
>>>> an API-independent issue. In general it's probably best if the app can
>>>> specify its requirements declaratively up front --- e.g., the sort of
>>>> devices it wants to talk to --- and let the browser, interacting with the
>>>> user, decide which devices to connect to (and remember that decision
>>>> between runs). That reduces the amount of trust the user has to give the
>>>> app.
>>>>
>>>> Rob
>>>> --
>>>> "If we claim to be without sin, we deceive ourselves and the truth is
>>>> not in us. If we confess our sins, he is faithful and just and will forgive
>>>> us our sins and purify us from all unrighteousness. If we claim we have not
>>>> sinned, we make him out to be a liar and his word is not in us." [1 John
>>>> 1:8-10]
>>>>
>>>
>>>
>>>
>>> --
>>> Vilson Vieira
>>>
>>> vilson@void.cc
>>>
>>> ((( http://automata.cc )))
>>>
>>> ((( http://musa.cc )))
>>>
>>
>>
>
> ... .  .    .       Joe
>
> *Joe Berkovitz*
> President
>
> *Noteflight LLC*
> 84 Hamilton St, Cambridge, MA 02139
> phone: +1 978 314 6271
> www.noteflight.com
>
>
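As an illustration of Rob's writeMIDI() suggestion above, a worker-side processing callback might look roughly like this (all of the names here, including event.inputs, midiEvents, and writeMIDI, are hypothetical scaffolding standing in for the MediaStreams Processing API; only the shape of the idea comes from the thread):

```javascript
// Hypothetical per-block processing callback: each input buffer carries an
// array of MIDIEvent-like objects, and the event exposes writeMIDI() for
// the stream's MIDI output. This example passes every message through,
// transposing notes up an octave.
function onProcessMedia(event) {
  const out = [];
  for (const input of event.inputs) {
    for (const ev of input.midiEvents) {
      const [status, note, velocity] = ev.data;
      out.push({ time: ev.time, data: [status, note + 12, velocity] });
    }
  }
  event.writeMIDI(out);
}

// Simulated invocation standing in for the worker dispatching the event:
let written;
onProcessMedia({
  inputs: [{ midiEvents: [{ time: 0, data: [0x90, 60, 100] }] }],
  writeMIDI: (evs) => { written = evs; },
});
```

Supporting multiple output MIDI tracks per stream, as Rob wonders, would presumably just mean writeMIDI() taking a track index alongside the event array.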
Received on Monday, 6 February 2012 20:28:49 GMT

This archive was generated by hypermail 2.2.0+W3C-0.50 : Monday, 6 February 2012 20:28:49 GMT