[whatwg] Proposal: <audio> and text/event-stream

Hi All,

I'm exploring programmable MIDI, and would like to generate some
discussion.

Currently, <audio src="data:audio/midi;base64,...."></audio> is a valid
way of embedding a MIDI file; and if the browser actually supports MIDI,
it can result in a playable stream.

Live MIDI requires a data stream.

I'd like to see <audio> work with postMessage
and <audio src> work with text/event-stream.

.....

EventSource is a good starting point...
http://dev.w3.org/html5/eventsource/

...for something like this to make sense:
<audio src="data:text/event-stream,">

But it gets strange when trying to add an inner MIME type; this needs discussion:
<audio src="data:text/event-stream,data%..audio/midi," id="livemidi">

In script, postMessage would be used to send events to the element:
document.onkeydown = function(e) {
   // Translate the key press into raw MIDI bytes (helper not shown here).
   var bytecode = keyboardEventToBytecode(e);
   // Proposed extension: post the bytes straight to the <audio> element.
   document.getElementById('livemidi').postMessage(bytecode);
};

By using postMessage, the target could be an audio tag,
a web worker, or a web socket, and by implementing text/event-stream,
the browser could receive MIDI events from a web socket.
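
As a rough sketch of the receiving side: EventSource exists today, while
posting the decoded bytes into the <audio> element is the proposed part
(the "midi" event name, the /live-midi URL, and the base64 decoding with
atob are just illustrative assumptions):

var source = new EventSource('/live-midi');   // server sends text/event-stream
var player = document.getElementById('livemidi');
source.addEventListener('midi', function(e) {
   // e.data carries one MIDI message, base64-encoded in this sketch.
   var bytecode = atob(e.data);
   player.postMessage(bytecode);              // the proposed <audio> extension
});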

This is a proposal to extend the <audio> tag and a usage recommendation
for data:text/event-stream. It follows current specs closely
and is not restricted to MIDI.

....

The full Web Workers spec is a bit heavy for this use case:
http://www.whatwg.org/specs/web-workers/current-work/

Mozilla has been drafting a raw audio API:
https://wiki.mozilla.org/Audio_Data_API
MIDI bytecode is quite a bit smaller, so typed arrays are not
necessary.

Buffering issues are generally outside the scope of the W3C APIs.


-Charles

Received on Friday, 28 May 2010 17:31:19 UTC