Re: Should MIDIInput use a callback instead of events?

On Wed, Feb 20, 2013 at 3:26 AM, Chris Wilson <cwilso@google.com> wrote:

> On Tue, Feb 19, 2013 at 2:44 AM, Marcos Caceres <w3c@marcosc.com> wrote:
>
>> On 19/02/2013, at 10:16 AM, Dominic Cooney <dominicc@chromium.org> wrote:
>>
>> > 1. MIDIEvent does not specify a constructor, so it is not possible to
>>> programmatically create and dispatch a MIDIEvent. That complicates
>>> scenarios such as writing unit tests for MIDI input handlers.
>>> Yep, we have a bug for this. The Editor has been requesting additional
>>> use cases. See:
>>> https://github.com/WebAudio/web-midi-api/issues/1
>>>
>>> Be great if you could also comment there :)
>>>
>>
>> Sure. I will do that. It looks like the testing case is already discussed
>> to some extent.
>>
>> Yup.  This is an open issue.
>
>>  > 2. As noted in the spec, DOM4 Event objects have a lower-resolution
>> timeStamp. I think it is confusing to have two kinds of timestamps.
>>
>>>
>>> Yep, I raised the same issue - see discussion at:
>>> https://github.com/WebAudio/web-midi-api/issues/4
>>>
>>> But we didn't find a way around that.
>>>
>>
>> Not using DOM4 Events is one way around it.
>>
>>
>>> > 3. The Event interface [2] and EventTarget interface [3] bring with
>>> them a number of concepts that aren’t useful for this application: related
>>> targets (i.e. the currentTarget property), event phases, event propagation
>>> (perhaps), bubbling, cancelation, default handling and capturing.
>>>
>>> Right - despite not participating in a DOM tree, the eventing model
>>> still seems appropriate.
>>>
>>
>> I think we misunderstand each other; I think the DOM4 Event model seems
>> inappropriate because many concepts are meaningless to Web MIDI. However
>> this is admittedly true of other uses of DOM4 Events, such as WebSockets.
>> How significant is wiring up multiple listeners to a single MIDIInput?
>>
>> This is still an issue/under-described use case in the spec. I've also
>> raised this previously (i.e., are these objects singletons?)
>>
>
> There are still some details to work out here, yes.  I wanted to weigh in
> on the DOM4 Event-ness briefly-
>
> I'm not 100% tied to DOM4 Events.  I do think we need a listener pattern,
> however, because of the way MIDI works - there are multiple channels on a
> single Port, and controllers and notes may be handled by separate code
> modules (as well as sysex).
>
> For example - the keyboard controller on my desk would be represented by a
> single MIDIInput port - named "Nocturn keyboard".  It has (in addition to
> 25 piano-style keys) 8 drum pads and 8 continuous controller knobs.  The
> drum pads send their note on/offs on a separate channel than the piano keys
> - because there would be a different "synth module" responding to them,
> typically.  It's pretty easy to see how you would want to structure the
> code in a DAW to use event listeners to respond to multiple channels -
> otherwise, most applications would simply have to write their own
> listener pattern under the hood.
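
If I'm reading that right, the intent is roughly something like the
sketch below - purely illustrative, with the event shape (event.data as
the raw MIDI bytes) and all the names invented here rather than taken
from the draft:

  // One listener per "module", each filtering on the channel nibble
  // (low four bits) of the status byte.
  function listenOnChannel(input, channel, handler) {
    input.addEventListener('message', function (event) {
      if ((event.data[0] & 0x0f) === channel) {
        handler(event);
      }
    });
  }

  // e.g. piano keys on channel 0, drum pads on (say) channel 9
  listenOnChannel(keyboardInput, 0, pianoModule);
  listenOnChannel(keyboardInput, 9, drumModule);
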
>
>  > 4. The style of MIDIInput implements EventTarget is not used by recent
>>> specs; these universally use X : EventTarget for some interface X. This old
>>> style of EventTarget implementation complicates implementation.
>>>
>>> This is my fault and I've been meaning to file a bug to address this, so
>>> thanks for bringing it up. The design of the API did not lend itself to
>>> support both EventTarget and MIDIPort because MIDIPort does not inherit
>>> from EventTarget. Generally, EventTarget sits at the bottom of the
>>> inheritance chain, but that would mean MIDIPort would become an
>>> EventTarget. We didn't want to make MIDIOutput unnecessarily become an
>>> EventTarget; hence, "implements" EventTarget on MIDIInput seemed the right
>>> way to go.
>>>
>>>  (btw ...it actually does not complicate the implementation - but you
>>> are correct in that it's a bit weird - but this is DOM3's fault because it
>>> kinda says to do this and it's what browsers have generally done). AFAIK,
>>> only Firefox seems to expose the EventTarget interface as per WebIDL (this
>>> is something recent) - other browsers don't yet expose EventTarget - like
>>> in Chrome, try window.EventTarget … or try searching the prototype chain of
>>> any object that inherits EventTarget and you will see it is not there
>>> (shock!, I know :)).
>>>
>>
>> I'm familiar with the implementation of EventTarget in Chrome :) Having
>> both "implements EventTarget" and ": EventTarget" complicates the
>> implementation of wrapping and unwrapping EventTargets. If all EventTargets
>> were at the root of the prototype chain, machinery for the JavaScript
>> wrappers of EventTargets could be simplified.
>>
>>
>> I agree. As I said, the issue is the current inheritance model. Maybe
>> MIDIInput : EventTarget, but implements MIDIPort?
>>
>
> Actually, with the reopening of the connect/disconnect events issue, I
> believe I'll just be moving EventTarget to MIDIPort: that is,
>
> interface MIDIPort : EventTarget {
> ...
> attribute EventHandler ondisconnect;
> ...
> }
> interface MIDIInput : MIDIPort {
> ...
> attribute EventHandler onmessage;
> ...
> }
> interface MIDIOutput : MIDIPort {
> ...
> }
>

This solution looks preferable to me--neat.
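
Just to check my reading, script against that shape would look roughly
like the following (the 'message'/'disconnect' event type names are my
guesses to match the handler attributes, not something the draft pins
down):

  input.addEventListener('message', handleMidiMessage);   // MIDIInput only
  input.addEventListener('disconnect', handlePortGone);   // inherited from MIDIPort
  output.addEventListener('disconnect', handlePortGone);  // MIDIOutput gets it too
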

>  > One possible benefit of using DOM Events is that it permits wiring up
>>> multiple listeners. Is that an intended use case of onmessage?
>>>
>> No, that's what .addEventListener() is for. onmessage is just the
>>> catchall.
>>
>>
>> I think I wrote that line too quickly. I meant "is that the intended use
>> case of the events that are handled by the onmessage handler", etc.
>>
>> I guess the answer is yes.
>>
>
> +1.
>
>> Is it likely for two libraries to compete for a given MIDIInput object?
>> These are not global objects like window or document.
>>
>> Could be, if they are singletons. Still not clear if they are or not.
>> Certainly good candidates to be and I think I implemented them that way in
>> my own reference implementation.
>>
>
> Singletons or not, I can easily see wanting to attach separate processors
> for CC messages and note on/offs.
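
That makes sense - something like two independent listeners on the same
input, each filtering on the type nibble of the status byte (again just
a sketch; handleCC/handleNote and the event shape are invented here):

  input.addEventListener('message', function (event) {
    if ((event.data[0] & 0xf0) === 0xb0) { handleCC(event); }     // control change
  });
  input.addEventListener('message', function (event) {
    var type = event.data[0] & 0xf0;
    if (type === 0x90 || type === 0x80) { handleNote(event); }    // note on/off
  });
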
>
>
>>  > I note that Web Audio’s ScriptProcessorNode has a roughly analogous
>>> case: processing buffers of binary audio data in onaudioprocess.
>>> ScriptProcessorNode has moved away from using DOM Events. Given the
>>> problems using DOM Events for MIDIInput noted above, I think it would be
>>> better for MIDIInput to have a simple callback, perhaps something like:
>>>
>> Note that ScriptProcessorNode is pretty different - it's really unlikely
> that you would want to arbitrarily enable multiple "listeners" on a
> ScriptProcessorNode, and you certainly wouldn't want to pay any complexity
> cost for that.
>

Thanks Chris (and Marcos in a previous message) for clarifying. I don't
know the domain well, so I didn't realize that multiple listeners were a
common case.

Regards,

Dominic

Received on Wednesday, 20 February 2013 02:59:47 UTC