Re: Should MIDIInput use a callback instead of events?

Hi Dominic!

On Tuesday, 19 February 2013 at 06:45, Dominic Cooney wrote:

> Greetings Web Audiophiles,
>  
> In reading the Web MIDI spec I note that MIDIInput is declared this way [1]:
>  
> MIDIInput implements EventTarget;
>  
> interface MIDIInput : MIDIPort {
> attribute EventHandler onmessage;
> };
>  
>  
> interface MIDIEvent : Event {
> readonly attribute double receivedTime;
> readonly attribute Uint8Array data;
> };
>  
>  
> There are a number of problems with these definitions.
>  
> 1. MIDIEvent does not specify a constructor, so it is not possible to programmatically create and dispatch a MIDIEvent. That complicates scenarios such as writing unit tests for MIDI input handlers.
Yep, we have a bug for this. The Editor has been requesting additional use cases. See:
https://github.com/WebAudio/web-midi-api/issues/1

It'd be great if you could also comment there :)
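For what it's worth, the kind of thing a test would want to do is roughly this (purely illustrative - the exact constructor/init shape is what the bug is about, and I'm assuming the dispatched event type is "message" and that `input` is a MIDIInput):

  // Hypothetical: construct a fake MIDIEvent and dispatch it at an input,
  // e.g. a note-on for A4 at full velocity, so a handler can be unit tested.
  var fake = new MIDIEvent("message", {
    receivedTime: 0,
    data: new Uint8Array([0x90, 0x45, 0x7F])
  });
  input.dispatchEvent(fake);
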
>  
> 2. As noted in the spec, DOM4 Event objects have a lower-resolution timeStamp. I think it is confusing to have two kinds of timestamps.

Yep, I raised the same issue - see discussion at:  
https://github.com/WebAudio/web-midi-api/issues/4

But we didn't find a way around that.  
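Just to illustrate the confusion (rough sketch; assuming the event type dispatched to onmessage is "message"):

  input.addEventListener("message", function (e) {
    console.log(e.timeStamp);     // DOM4 Event timestamp, lower resolution
    console.log(e.receivedTime);  // Web MIDI's own high-resolution timestamp
  });
  // Two timestamps with different clocks/resolutions - that's the confusing part.
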
> 3. The Event interface [2] and EventTarget interface [3] bring with them a number of concepts that aren’t useful for this application: related targets (ie the currentTarget property), event phases, event propagation (perhaps), bubbling, cancelation, default handling and capturing.

Right - even though a MIDIInput doesn't participate in a DOM tree, the eventing model still seems appropriate.  
>  
> 4. The style of MIDIInput implements EventTarget is not used by recent specs; these universally use X : EventTarget for some interface X. This old style of EventTarget implementation complicates implementation.

This is my fault and I've been meaning to file a bug to address this, so thanks for bringing it up. The design of the API didn't lend itself to supporting both EventTarget and MIDIPort, because MIDIPort does not inherit from EventTarget. Generally, EventTarget sits at the bottom of the inheritance chain, but putting it there would mean MIDIPort - and therefore MIDIOutput - would become an EventTarget. We didn't want to make MIDIOutput an EventTarget unnecessarily; hence, "implements EventTarget" on MIDIInput seemed the right way to go.

(btw ...it actually does not complicate the implementation - but you are correct that it's a bit weird. That's DOM3's fault, though, because it kinda says to do this and it's what browsers have generally done.) AFAIK, only Firefox exposes the EventTarget interface as per WebIDL (and only recently) - other browsers don't yet expose EventTarget. In Chrome, for instance, try window.EventTarget … or search the prototype chain of any object that inherits EventTarget and you will see it is not there (shock!, I know :)).
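E.g., a quick console check (results as of writing; purely illustrative):

  typeof window.EventTarget;  // "undefined" in Chrome today; defined in Firefox
  // and walking the prototype chain of, say, a DOM node won't turn up an
  // EventTarget.prototype in browsers that don't expose the interface.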

I don't know how we "fix" this except to make MIDIPort inherit from EventTarget. For a bit more discussion, see also:
http://lists.w3.org/Archives/Public/public-audio/2012OctDec/0658.html

> One possible benefit of using DOM Events is that it permits wiring up multiple listeners. Is that an intended use case of onmessage?
No, that's what .addEventListener() is for; onmessage is just the catch-all. Having only a single onmessage is dangerous in that any script can easily override it (so a naive library B can break library A - or, put another way, libraries A and B can't both rely on onmessage, unless each instance of MIDIInput is unique, but then it just gets complicated). Using addEventListener protects against such conflicts and keeps things relatively simple and safe.
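A rough sketch of what I mean (libraryAHandler/libraryBHandler are made up, and I'm again assuming the event type is "message"):

  // Two libraries can happily share the same MIDIInput with events:
  input.addEventListener("message", libraryAHandler);
  input.addEventListener("message", libraryBHandler);

  // Whereas with only a handler attribute, the second assignment silently
  // clobbers the first - library A never hears another message:
  input.onmessage = libraryAHandler;
  input.onmessage = libraryBHandler;
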
> I note that Web Audio’s ScriptProcessorNode has a roughly analogous case: processing buffers of binary audio data in onaudioprocess. ScriptProcessorNode has moved away from using DOM Events. Given the problems using DOM Events for MIDIInput noted above, I think it would be better for MIDIInput to have a simple callback, perhaps something like:

I don't think the above are "problems" - or at least I'm not seeing/understanding explicit problems. The only thing you mentioned was that it "complicates implementation", but it would be great if you could qualify that with some explicit details (today's browsers don't have a problem implementing EventTarget - though, as I said, all except Firefox do it without exposing the interface as per WebIDL, AFAIK). WebIDL's "implements", as I understand it, just says to copy the methods/attributes from one interface onto another without inheriting from it - the behaviour stays the same.
>  
> interface MIDIInput {
> attribute MIDIInputHandler? onmessage;
> }
>  
> callback MIDIInputHandler = void (double receivedTime, Uint8Array data);
I personally don't like the above, because (as a JS developer) it would mean having to build my own event dispatcher to route everything from onmessage to other listeners - and, as I said, there's the potential for someone else's library to accidentally trash/replace my callback. It's also confusing: if one finds an "onwhatever" handler attribute on an interface in the IDL, the expectation is that one can also .addEventListener("whatever") on that same interface. And if this API grows in the future, it would mean stacking more arguments onto the MIDIInputHandler callback (though you are correct that MIDIEvent will contain a lot of useless garbage inherited from DOM's Event object).
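Here's roughly the plumbing I'd end up writing with the callback design (just a sketch):

  var listeners = [];
  input.onmessage = function (receivedTime, data) {
    // fan the single callback out to everyone who registered with me
    listeners.forEach(function (fn) { fn(receivedTime, data); });
  };
  // ...plus add/remove functions on top of that - which is exactly what
  // EventTarget/addEventListener already gives us for free.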
  
Kind regards,
Marcos  

Received on Tuesday, 19 February 2013 07:38:37 UTC