Re: Adding MIDI APIs to Audio WG Charter (was: MIDI enumeration (was: Re: getUserMedia use cases))

Hello,

The way MIDI could be integrated inside MediaStreams (especially audio) 
seems relevant to me, given how we often use it:
in a lot of the Max/MSP and PureData patches we have developed, MIDI is 
used to 'control' audio, so that makes sense.
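
To make that concrete, here is a minimal sketch of the 'MIDI controls 
audio' pattern as it might look in a browser. The Web Audio side follows 
the current Web Audio API draft; the requestMIDIAccess entry point is 
only an assumption for illustration, not an agreed API:

    const ctx = new AudioContext();
    const gain = ctx.createGain();
    gain.gain.value = 0.5;

    const osc = ctx.createOscillator();   // stand-in audio source
    osc.connect(gain);
    gain.connect(ctx.destination);
    osc.start();

    // Assumed MIDI entry point, modelled on a requestMIDIAccess-style API
    // (illustrative only, not part of any current specification).
    (navigator as any).requestMIDIAccess().then((midi: any) => {
      for (const input of midi.inputs.values()) {
        input.onmidimessage = (event: { data: Uint8Array }) => {
          const [status, controller, value] = event.data;
          // 0xB0-0xBF = control change; controller 7 = channel volume
          if ((status & 0xf0) === 0xb0 && controller === 7) {
            gain.gain.setValueAtTime(value / 127, ctx.currentTime);
          }
        };
      }
    });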

(On the other hand, Chris Rogers told me that having separate core 
functionalities for audio, video, etc. works well (see Mac CoreAudio, 
CoreGraphics, ...), and I'm wondering why we should have them inside the 
same API (MIDI inside the Audio API).
An argument pro: MIDI is for audio.
An argument con: MIDI is also used for video control.)
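
Regarding Rob's suggestion below of representing a MIDI track as a 
sequence of timestamped events, a rough sketch of such a representation 
(all names here are illustrative assumptions, not part of any proposed 
specification) could be:

    interface MIDIEvent {
      timestamp: number;   // seconds on the track's timeline
      data: Uint8Array;    // raw MIDI message bytes
    }

    interface MIDITrack {
      kind: "midi";        // alongside "audio" and "video" track kinds
      events: MIDIEvent[];
    }

    // Example: a middle-C note-on at t = 1.0 s, note-off half a second later.
    const track: MIDITrack = {
      kind: "midi",
      events: [
        { timestamp: 1.0, data: Uint8Array.of(0x90, 60, 100) },
        { timestamp: 1.5, data: Uint8Array.of(0x80, 60, 0) },
      ],
    };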

Regards

samuel


On 04/02/12 01:32, Robert O'Callahan wrote:
> BTW if we really need to be able to generate, process and play MIDI 
> tracks in real time in the browser, a good approach might be to add a 
> MIDI track type to MediaStreams. Then with very little extra API 
> surface we could extend getUserMedia to support capturing MIDI tracks, 
> extract recorded MIDI tracks using mediaElement.captureStream, extend 
> the Worker onprocessmedia event to support reading and writing MIDI 
> data, synthesize and play back by feeding the MediaStream into an 
> <audio> element, record MIDI using StreamRecorder, etc. This approach 
> would let you keep MIDI tracks in sync with other video and audio 
> sources (including other kinds of audio processing). (I'm assuming 
> MIDI tracks can be represented as a sequence of timestamped events; I 
> don't know much about MIDI!)
>
> Rob
> -- 
> "If we claim to be without sin, we deceive ourselves and the truth is 
> not in us. If we confess our sins, he is faithful and just and will 
> forgive us our sins and purify us from all unrighteousness. If we 
> claim we have not sinned, we make him out to be a liar and his word is 
> not in us." [1 John 1:8-10]

Received on Saturday, 4 February 2012 09:15:31 UTC