
Re: Draft Updated Charter adding Mouse Lock and Gamepad

From: Chris Wilson <cwilso@google.com>
Date: Wed, 5 Oct 2011 15:30:14 -0700
Message-ID: <CAJK2wqWBK7bAiz+ftuV3iSc0aMTar_nzZYL0YGGSi+phaftrxA@mail.gmail.com>
To: robert@ocallahan.org
Cc: Olli@pettay.fi, public-webevents@w3.org
Hmm.  An intriguing idea, and I can see what you mean by thinking of MIDI
synthesizers in JS as being essentially just a stream transform.  I had a
few off-the-cuff concerns with that approach, based on my current
understanding of Media Streams - can you help me understand these?

   1. I'm somewhat confused by how the selection of sources would work in
   practice, even for audio/video.  It seems from a quick review like it would
   be a very Flash-esque "you have one audio input, one video input, one audio
   output (although that's really just an <audio> element)" - i.e., there's no
   multiple-input/multiple-output stream handling.  Or am I missing something?
    Sorry, Media Streams is rather complex to grok, but I'm working on it.
   2. On that same topic, it would be quite rare to only have one MIDI
   interface (i.e., one 16-channel source/sink) in a studio environment.  I
   think I have around 16 at home (i.e. 16 MIDI ins, 16 MIDI outs - each with
   MIDI's 16 channels).  I really don't want to pop a "please select which MIDI
   interface to use" prompt, a la Flash's "this app wants to use your microphone
   and camera" (as the getUserMedia approach implies), every time I load my
   sequencer app.  Similarly, on the audio side I don't think a single input
   stream is going to cut it for DAW purposes; I have around 20 channels of
   audio on my workstation at home (and it's a very old setup).
   3. I'm concerned about the content model of different streams - i.e., how
   easy it would be to make the MIDI stream object model expose typical short
   messages and sysex messages, such that you can easily fire the key on/off
   "events" that they represent.
   4. There doesn't seem to be symmetry between input and output of audio
   streams - or, really, the output object model is left to <audio>.  With
   MIDI, output and input are the same kinds of messages, and (more
   importantly) they will likely need to multiplex to the same number of
   different places (i.e. a single-output-sink model would not work at all for
   anything other than a General MIDI player).  At the very least, this seems
   like it would only solve the input problem for MIDI - because the local
   output models in Streams are currently just "sink it to an <audio> or
   <video>."  Or am I misunderstanding?
   5. I'm just slightly nervous about the general idea of treating processing
   of MIDI like processing of audio, given that it's not a consistent stream of
   temporal data in the same way audio is; it's instructions.  (From
   http://www.midi.org/aboutmidi/intromidi.pdf: "MIDI...is a system
   that allows electronic musical instruments and computers to send
   instructions to each other.")  Maybe that's okay, but beyond the single
   scenario of implementing a JS synthesizer (an important one, obviously),
   you could apply the same logic and say game controllers are incoming
   streams of instructions too.
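To make points 3 and 5 concrete, here is a rough sketch of the kind of object
model I mean - this is not any proposed API, just an illustration of how short
messages and sysex differ, using standard MIDI 1.0 framing:

```javascript
// Hypothetical helper, not part of Media Streams or any spec: classify a
// raw MIDI message by its status byte.  Per MIDI 1.0, 0x8n is note off,
// 0x9n is note on (channel n), 0xF0 opens a sysex message and 0xF7 closes it.
function classifyMidiMessage(bytes) {
  const status = bytes[0];
  if (status === 0xF0) {
    // Sysex: arbitrary-length payload between the 0xF0 and 0xF7 framing bytes.
    return { type: "sysex", data: bytes.slice(1, -1) };
  }
  const kind = status & 0xF0;    // upper nibble: message type
  const channel = status & 0x0F; // lower nibble: channel 0-15
  if (kind === 0x90 && bytes[2] > 0) {
    return { type: "noteon", channel, note: bytes[1], velocity: bytes[2] };
  }
  if (kind === 0x80 || (kind === 0x90 && bytes[2] === 0)) {
    // Note on with velocity 0 is note off, by long-standing convention.
    return { type: "noteoff", channel, note: bytes[1] };
  }
  return { type: "other", channel };
}
```

The point being: short messages map naturally onto discrete key on/off events,
while sysex is an opaque variable-length blob - a stream model would need to
surface both cleanly.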

I guess I'm having trouble understanding just how simple (or not) this would
look in practice, because the other practical uses of Streams are pretty
involved.  I'd like to make sure simple input/output scenarios aren't much
more involved, or with more overhead, than similar Windows/Mac MIDI
scenarios - because frankly, those are already pretty involved.
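For what it's worth, the shape I'd hope for on the selection side is something
like the following - purely hypothetical, nothing like this exists in Media
Streams today, and the names are made up - where an app enumerates and filters
ports programmatically instead of prompting per interface:

```javascript
// Hypothetical sketch only: model a studio with 16 MIDI interfaces
// (16 ins, 16 outs), as described above, as plain port descriptors.
const ports = [];
for (let i = 0; i < 16; i++) {
  ports.push({ id: `midi-in-${i}`,  direction: "input",  name: `Interface ${i} In` });
  ports.push({ id: `midi-out-${i}`, direction: "output", name: `Interface ${i} Out` });
}

// A sequencer would bind ports by name/direction, with one consent grant,
// rather than one "select your device" dialog per interface.
function selectPorts(ports, direction, pattern) {
  return ports.filter(p => p.direction === direction && pattern.test(p.name));
}

const inputs = selectPorts(ports, "input", /Interface \d+/);
// inputs.length === 16 - a per-port prompt at this scale would be unusable.
```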

On Wed, Oct 5, 2011 at 2:00 PM, Robert O'Callahan <robert@ocallahan.org> wrote:

> My goal is to turn MediaStreams into a generic framework for real-time
> signal processing.
> http://hg.mozilla.org/users/rocallahan_mozilla.com/specs/raw-file/tip/StreamProcessing/StreamProcessing.html
> We could add a MIDI track type and add MIDI capture to
> navigator.getUserMedia. You'd also be able to load MIDI data using an HTML
> media element. Then you could write a MIDI synthesizer in JS, running in a
> Worker, taking raw MIDI data as input and generating output audio samples.
> Because MediaStreams maintain synchronization, you'd get that for free; for
> example, if you had a MIDI track and a video track synchronized, they'd stay
> synchronized after sound synthesis. And you'd be able to mix MIDI output
> with other audio effects.
> Rob
> --
> "If we claim to be without sin, we deceive ourselves and the truth is not
> in us. If we confess our sins, he is faithful and just and will forgive us
> our sins and purify us from all unrighteousness. If we claim we have not
> sinned, we make him out to be a liar and his word is not in us." [1 John
> 1:8-10]
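As a sanity check on the pipeline Robert describes - MIDI instructions in, audio
samples out - the core transform can be sketched as a pure function (this is
illustrative only, not the MediaStreams API; in his proposal it would run inside
a Worker over stream buffers):

```javascript
// Standard MIDI-note-to-frequency mapping: A4 = MIDI note 69 = 440 Hz,
// 12 semitones per octave (equal temperament).
function midiNoteToFrequency(note) {
  return 440 * Math.pow(2, (note - 69) / 12);
}

// Sketch of a JS "synthesizer" as a stream transform: take note-on
// instructions, emit a buffer of summed sine-wave samples.
function synthesize(noteOnEvents, durationSec, sampleRate = 44100) {
  const out = new Float32Array(Math.floor(durationSec * sampleRate));
  for (const ev of noteOnEvents) {
    const freq = midiNoteToFrequency(ev.note);
    const gain = ev.velocity / 127; // velocity 0-127 mapped to amplitude
    for (let i = 0; i < out.length; i++) {
      out[i] += gain * Math.sin(2 * Math.PI * freq * (i / sampleRate));
    }
  }
  return out;
}

const samples = synthesize([{ note: 69, velocity: 127 }], 0.01);
// 0.01 s at 44100 Hz -> 441 samples of a 440 Hz sine wave
```

The open questions in the list above are exactly about what sits around a
function like this: where the events come from, and where the samples go.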
Received on Wednesday, 5 October 2011 22:30:41 UTC
