This is an archived snapshot of W3C's public bugzilla bug tracker, decommissioned in April 2019. Please see the home page for more details.
I'm wondering if we really need MIDIOutput and MIDIInput to extend MIDIPort. It seems like overkill to add two more interfaces to the platform when all each one adds is "onmessage" and "send()", respectively. Additionally, the "type" attribute of MIDIPort is already restricted to "input" or "output", so it feels redundant to have a whole new interface to determine the type of an object (which is already part of the base class). I recommend just folding MIDIOutput and MIDIInput back into MIDIPort. A port of type "input" simply does nothing when send() is invoked, and a MIDIPort of type "output" does not receive message events. Is there a particular use case I'm missing that is not already covered?
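As a sketch of what that folded design might look like (the attribute list beyond "type", "onmessage", and "send()" is an assumption, not taken from the spec text quoted here):

```webidl
// Hypothetical merged interface: one MIDIPort carries both the
// input-side event handler and the output-side send() method.
interface MIDIPort : EventTarget {
    readonly attribute DOMString    type;       // "input" or "output"
             attribute EventHandler onmessage;  // only fires when type is "input"
    // Assumed signature for illustration; a no-op when type is "input".
    void send (sequence<octet> data, optional double timestamp);
};
```

The trade-off, as noted, is that every port then exposes both members, with half of them inert depending on "type".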
Just thought of something... maybe MIDIPort does not need to extend EventTarget; rather, MIDIInput should implement it instead. You will note that "window.EventTarget" is not actually exposed as an object in any browser (well, in Opera, Chrome, and Safari at least). That would solve the issue of MIDIOutput exposing .addEventListener() and friends.
To be clear, what I mean is:

interface MIDIInput : MIDIPort {
    attribute EventHandler onmessage;
};
MIDIInput implements EventTarget;

And MIDIPort becomes just:

interface MIDIPort {
    ...
};
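For completeness, under that split the output side would be the only interface carrying send(), and it would not implement EventTarget (the send() signature below is an assumption for illustration):

```webidl
interface MIDIOutput : MIDIPort {
    // Assumed signature; the point is that send() lives only here.
    void send (sequence<octet> data, optional double timestamp);
};
// Note there is deliberately no "MIDIOutput implements EventTarget;"
// line, so outputs never expose .addEventListener() and friends.
```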
That's essentially how it used to be, so sure. :) https://github.com/WebAudio/web-midi-api/commit/6f0ca73534d69f9f902bcffeda908f4d26834635
Batch-closing RESOLVED MIDI issues. Reminder: midi issues now tracked at https://github.com/WebAudio/web-midi-api/issues