Re: Adding MIDI APIs to Audio WG Charter (was: MIDI enumeration (was: Re: getUserMedia use cases))

Hello Robert,

On Mon, Feb 6, 2012 at 9:27 PM, Robert O'Callahan <robert@ocallahan.org> wrote:

> On Tue, Feb 7, 2012 at 5:28 AM, Jussi Kalliokoski <
> jussi.kalliokoski@gmail.com> wrote:
>
>> I put together a gist in the form of IDL of what MIDI in the browser
>> could look like, with respect to the MediaStreams Processing API and the
>> getUserMedia API. Excuse my weak IDL skills; I hope it suffices to
>> introduce my idea. It's also a bit incomplete; for example, it doesn't
>> describe how to actually push the MIDI stream to the MediaStreamProcessor,
>> because I haven't thought of a good way to do it yet. I also included an
>> example usage.
>>
>> https://gist.github.com/1752949
>>
>> Feedback appreciated.
>>
>
> To make sure you can get MIDI data for all input streams to a
> ProcessedMediaStream, I'd give each MediaInputBuffer an array of MIDIEvent
> objects comprising the MIDI data for the first MIDI track, and give
> ProcessMediaEvent a writeMIDI() method that lets you write an array of
> MIDIEvents to the stream output. Maybe it's necessary to support multiple
> output MIDI tracks for a single stream?
>

I agree. The writeMIDI() method was actually something I forgot to include,
but had in mind initially. Also, it would be pretty handy if
ProcessedMediaStream had a method to attach a MIDI input to it, e.g.
myPMS.addStream(MIDIInputDevice).
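
Purely as a sketch of what I have in mind (the names MIDIInputDevice,
addStream(), writeMIDI() and onprocessmedia here follow the gist and your
suggestion; nothing is specified yet):

    // Main thread (hypothetical): route a MIDI input device into a
    // ProcessedMediaStream whose processing runs in a worker.
    var midiIn = new MIDIInputDevice("USB Keyboard");
    var stream = new ProcessedMediaStream(new Worker("synth.js"));
    stream.addStream(midiIn);

    // synth.js (worker side): read the incoming MIDI and write MIDI out.
    onprocessmedia = function (event) {
      var messages = event.inputs[0].midiMessages || [];
      // ...react to the messages, e.g. render audio for them...
      event.writeMIDI(messages); // pass the messages through to the output
    };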

Multiple output channels would also be handy, which reminds me that I forgot
that MIDI events carry a channel as well. I'll have to fix that. Also, the
MIDIEvent interface name might be confusing; it would probably be better to
call it MIDIMessage.
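
For instance, a single message could look roughly like this (only a sketch of
the shape; none of the field names are settled):

    // A hypothetical MIDIMessage: note-on for middle C on channel 1.
    var message = {
      channel: 0,       // 0-15, i.e. MIDI channels 1-16
      status: 0x90,     // note-on
      data: [60, 100],  // note number, velocity
      sampleTime: 128   // offset in samples within the current buffer
    };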


> Also, ProcessMediaEvent already has an 'inputTime', which probably means
> the same thing as "sampleTime".
>

I'm not sure; what I meant by sampleTime is the timestamp, in samples, from
the beginning of the currently processed audio buffer, to help synchronize
the MIDI events with the audio.
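
So in the worker callback the idea would be something like this (a sketch
only, reusing the hypothetical names from above):

    // Schedule each MIDI message at its sample-accurate position within
    // the audio buffer currently being processed.
    onprocessmedia = function (event) {
      var messages = event.inputs[0].midiMessages || [];
      for (var i = 0; i < messages.length; i++) {
        var m = messages[i];
        // e.g. at 44100 Hz, a message 2.9 ms into the buffer has a
        // sampleTime of roughly 128 samples
        triggerNote(m, m.sampleTime); // placeholder for the app's own synth
      }
    };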


> We also need to discuss whether main-thread access to MIDI data streams is
> worth having. Most of the time you want the latency and performance
> benefits of handling the data in a Worker, and it's trivial to have a
> worker send and receive data to the main thread if needed, so a dedicated
> main-thread API may not be needed.
>

I believe main-thread access is highly useful for cases that don't involve
audio/video, like a virtual MIDI keyboard in the browser. That said, the
worker access is even more useful, but I think the routing functionality
should be controlled from the main thread, just as it is with audio.
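
As an example of the main-thread-only case (the onmidimessage event here is
made up for illustration):

    // A virtual MIDI keyboard UI: no audio processing involved, so a
    // main-thread listener on the input device is all that's needed.
    midiIn.onmidimessage = function (message) {
      highlightKey(message.data[0]); // the app's own UI code
    };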


> There's a whole separate conversation to be had about how devices should
> be accessed and enumerated and how the UI could work. This is mostly an
> API-independent issue. In general it's probably best if the app can specify
> its requirements declaratively up front --- e.g., the sort of devices it
> wants to talk to --- and let the browser, interacting with the user, decide
> which devices to connect to (and remember that decision between runs). That
> reduces the amount of trust the user has to give the app.
>
> Rob
> --
> "If we claim to be without sin, we deceive ourselves and the truth is not
> in us. If we confess our sins, he is faithful and just and will forgive us
> our sins and purify us from all unrighteousness. If we claim we have not
> sinned, we make him out to be a liar and his word is not in us." [1 John
> 1:8-10]
>
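
Purely as a sketch of what such a declarative, up-front request could look
like (a hypothetical getUserMedia-style call; the constraint names are made
up):

    // The app states up front that it wants MIDI input and output; the
    // browser, together with the user, decides which devices to connect
    // and can remember that decision between runs.
    navigator.getUserMedia({ midi: { input: true, output: true } },
      function (midiStream) {
        // connected to whatever devices the user approved
      },
      function (error) {
        // the user declined, or no suitable device was found
      });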

Received on Monday, 6 February 2012 19:46:28 UTC