Re: Draft Updated Charter adding Mouse Lock and Gamepad

On Thu, Oct 6, 2011 at 11:30 AM, Chris Wilson <cwilso@google.com> wrote:

> Hmm.  An intriguing idea, and I can see what you mean by thinking of MIDI
> synthesizers in JS as being essentially just a stream transform.  I had a
> few off-the-cuff concerns with that approach, based on my current
> understanding of Media Streams - can you help me understand these?
>
>    1. I'm somewhat confused by how the selection of sources would work in
>    practice, even for audio/video.  It seems from a quick review like it
>    would be a very Flash-esque "you have one audio input, one video input,
>    one audio output (although that's really just an <audio> element)" -
>    i.e., there's no multiple-input/multiple-output stream handling.  Or am
>    I missing something?  Sorry, Media Streams is rather complex to grok,
>    but I'm working on it.
>
>
MediaStreams will be able to support multiple tracks, including multiple
tracks of the same type.
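
For example, enumerating the tracks of a single stream might look something
like this (following the current WHATWG draft shape; the exact property
names could still change):

    // Sketch only: given a MediaStream "stream", list its tracks.
    for (var i = 0; i < stream.tracks.length; i++) {
      var track = stream.tracks[i];
      // track.kind is "audio" or "video"; track.label is the
      // device-provided name, e.g. "Line In 3"
      console.log(track.kind + ": " + track.label);
    }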

My ProcessedMediaStream proposal doesn't currently support direct access to
multiple input tracks of the same type in a single input stream, for
simplicity. However, we could add an API to split individual tracks out into
separate streams and feed those into a ProcessedMediaStream as separate
input streams. Likewise, a ProcessedMediaStream can't produce multiple
output tracks of the same type, but you could use multiple
ProcessedMediaStreams (even sharing the same worker state) and merge their
results using another API that merges tracks from separate streams into a
single stream. Or we could add support for processing multiple tracks
directly. It depends on the use-cases for multi-track processing; I don't
understand those yet.
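
To make that concrete, here's a rough sketch. splitTracks and mergeTracks
are hypothetical helpers standing in for the track-splitting and
track-merging APIs we'd have to add; the rest follows the shape of my
ProcessedMediaStream proposal:

    // Hypothetical: split a multi-track stream into one single-track
    // stream per track (this API doesn't exist yet).
    var inputs = splitTracks(multiTrackStream);

    // Process each track separately; the ProcessedMediaStreams can share
    // state by using the same Worker.
    var worker = new Worker("processor.js");
    var processed = inputs.map(function (input) {
      var s = new ProcessedMediaStream(worker);
      s.addInput(input);
      return s;
    });

    // Hypothetical: merge the processed single-track streams back into
    // one multi-track stream (the other API we'd have to add).
    var output = mergeTracks(processed);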


>    2. On that same topic, it would be quite rare to only have one MIDI
>    interface (i.e., one 16-channel source/sink) in a studio environment.  I
>    think I have around 16 at home (i.e. 16 MIDI ins, 16 MIDI outs - each
>    with MIDI's 16 channels).  I really don't want to pop a "please select
>    which MIDI interface to use" a la Flash's "this app wants to use your
>    microphone and camera" as implied by
>    http://www.whatwg.org/specs/web-apps/current-work/multipage/video-conferencing-and-peer-to-peer-communication.html#obtaining-local-multimedia-content
>    every time I load my sequencer app.
>
>
The UA would want some kind of "remember this decision" UI (on a per-app
basis).


>
>    3. Similarly, on the audio side I don't think a single input stream is
>    going to cut it for DAW purposes; I have around 20 channels of audio on
>    my workstation at home (and it's very old and decrepit).
>
>
Probably the simplest approach would be to allow getUserMedia to return a
single MediaStream with 20 audio tracks, and make it easy to split those out
into separate streams if needed.
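
Roughly like this, reusing the hypothetical splitTracks helper from above
(and hand-waving the getUserMedia options, which are still in flux):

    navigator.getUserMedia("audio", function (stream) {
      // Suppose the UA exposes all 20 device channels as audio tracks
      // on this one stream.
      var channels = splitTracks(stream); // hypothetical helper
      // ... feed each single-track stream into its own processing graph ...
    });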


>    4. I'm concerned by the content model of different streams - i.e. how
>    easy it would be to make the MIDI stream object model expose typical
>    short messages and sysex messages, such that you can easily fire the
>    key on/off "events" that they represent.
>
>
I don't understand this one (I know nothing about MIDI).


>
>    5. There doesn't seem to be symmetry between input and output of audio
>    streams - or, really, the output object model is left to <audio>.  With
>    MIDI, output and input are the same kinds of messages, and (more
>    importantly) they will likely need to multiplex to the same number of
>    different places (i.e. a single-output-sink model would not work at all
>    for anything other than a General MIDI player).  At the very least,
>    this seems like it would only solve the input problem for MIDI -
>    because the local output models in Streams are currently just "sink it
>    to an <audio> or <video>."  Or am I misunderstanding?
>
>
I'm not sure what alternative outputs you need. Existing
MediaStreams-related proposals support recording to a (possibly compressed)
binary blob and streaming over the network via PeerConnection. We can add
new APIs that consume MediaStreams as needed.
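
For example, using the shapes in the current drafts (both of which may
still change):

    // Record to a binary blob (current WHATWG draft recorder shape).
    var recorder = stream.record();
    // ... later ...
    recorder.getRecordedData(function (blob) {
      // "blob" holds the (possibly compressed) recorded data.
    });

    // Stream over the network (draft PeerConnection shape);
    // sendSignalingMessage stands for whatever signaling channel the
    // app provides.
    var pc = new PeerConnection("STUN stun.example.net", sendSignalingMessage);
    pc.addStream(stream);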


>    6. I'm just slightly nervous by the general idea of treating processing
>    of MIDI like processing of audio, given that it's not a consistent
>    stream of temporal data in the same way as audio; it's instructions.
>    (From http://www.midi.org/aboutmidi/intromidi.pdf: "MIDI...is a system
>    that allows electronic musical instruments and computers to send
>    instructions to each other.")  Maybe that's okay, but other than the
>    single scenario of implementing a JS synthesizer (an important one,
>    obviously), I'd suggest you could similarly apply the same logic and
>    say game controllers are incoming streams of instructions too.
>
>
I think any stream of real-time timestamped data could theoretically be
added as a MediaStream track type. I'm not sure it would make sense to
include game controller input streams, though. In my view, the critical
things MediaStreams provide are synchronization and real-time processing
that's immune from main-thread (HTML event loop) latency. Generally, I
think it won't hurt to simply deliver game controller input to the main
thread as regular DOM events.
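
For illustration, main-thread delivery could look like any other input
event. This is a made-up event shape, not a concrete proposal:

    // Hypothetical "gamepadbuttondown" event, for illustration only.
    window.addEventListener("gamepadbuttondown", function (event) {
      // Ordinary main-thread handling; event-loop latency is acceptable
      // for most game input.
      console.log("pad " + event.gamepadIndex + " button " + event.button);
    });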

Rob
-- 
"If we claim to be without sin, we deceive ourselves and the truth is not in
us. If we confess our sins, he is faithful and just and will forgive us our
sins and purify us from all unrighteousness. If we claim we have not sinned,
we make him out to be a liar and his word is not in us." [1 John 1:8-10]
