Re: Adding MIDI APIs to Audio WG Charter (was: MIDI enumeration (was: Re: getUserMedia use cases))

From: Robert O'Callahan <robert@ocallahan.org>
Date: Sat, 4 Feb 2012 13:32:23 +1300
Message-ID: <CAOp6jLaf5wRhgmdH4qOda_AiT05g2c-MRhZ55bgaXfXHb6fA2A@mail.gmail.com>
To: "Marat Tanalin | tanalin.com" <mtanalin@yandex.ru>
Cc: "Tom White (MMA)" <lists@midi.org>, Chris Wilson <cwilso@google.com>, Doug Schepers <schepers@w3.org>, Joseph Berkovitz <joe@noteflight.com>, Robin Berjon <robin@berjon.com>, public-audio@w3.org, Dom Hazael-Massieux <dom@w3.org>, jussi.kalliokoski@gmail.com

BTW if we really need to be able to generate, process and play MIDI tracks
in real time in the browser, a good approach might be to add a MIDI track
type to MediaStreams. Then with very little extra API surface we could
extend getUserMedia to support capturing MIDI tracks, extract recorded MIDI
tracks using mediaElement.captureStream, extend the Worker onprocessmedia
event to support reading and writing MIDI data, synthesize and play back by
feeding the MediaStream into an <audio> element, record MIDI using
StreamRecorder, etc. This approach would let you keep MIDI tracks in sync
with other video and audio sources (including other kinds of audio
processing). (I'm assuming MIDI tracks can be represented as a sequence of
timestamped events; I don't know much about MIDI!)

"If we claim to be without sin, we deceive ourselves and the truth is not
in us. If we confess our sins, he is faithful and just and will forgive us
our sins and purify us from all unrighteousness. If we claim we have not
sinned, we make him out to be a liar and his word is not in us." [1 John
1:8-10]

Received on Saturday, 4 February 2012 00:33:01 UTC

This archive was generated by hypermail 2.3.1 : Tuesday, 6 January 2015 21:49:57 UTC