
Re: Draft Updated Charter adding Mouse Lock and Gamepad

From: Robert O'Callahan <robert@ocallahan.org>
Date: Fri, 7 Oct 2011 11:16:14 +1300
Message-ID: <CAOp6jLbCrn-Z_GCQVk1CymeGfLxfT2=2J1GS8nf+0Mcyfen7Mg@mail.gmail.com>
To: Chris Wilson <cwilso@google.com>
Cc: Olli@pettay.fi, public-webevents@w3.org
On Fri, Oct 7, 2011 at 11:02 AM, Chris Wilson <cwilso@google.com> wrote:

> I confess, the multiple tracks topology eluded me.  I'm trying to
> understand the common case of "interface supports many audio track i/os, how
> do I select them, and what does that turn into in terms of Media Streams?"
>  Is there an example that uses the DAW scenario?
>

No, but you can figure out what it would look like using the track APIs
defined in Hixie's MediaStreams proposal, for example.

> I mean if I write an algorithmic music generator that just wants to spit out
> a MIDI message stream, how do I create the output device, and what does the
> programming model for that look like?  I think if I'm doing this to output
> an audio stream, I write out to a binary blob (less ideal, but marginally
> workable for MIDI data) and then hook up the stream to an <audio> (which
> then routes it to the default audio device today).
>

You wouldn't go through a binary blob if you just want to play MIDI. Instead
you would have a JS MIDI synthesizer and connect its MediaStream output to
the input of an <audio> element.
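The core of such a JS MIDI synthesizer is turning MIDI note numbers into frequencies and samples. A minimal sketch of that step only — the `MediaStream`/`<audio>` plumbing is omitted, since Hixie's proposal was never finalized, and `noteToFrequency`/`noteOn` are illustrative names, not any specified API:

```javascript
// Convert a MIDI note number (0-127) to a frequency in Hz.
// Equal temperament: A4 = note 69 = 440 Hz; one semitone = 2^(1/12).
function noteToFrequency(note) {
  return 440 * Math.pow(2, (note - 69) / 12);
}

// A toy synthesizer voice: given a note-on, produce one cycle of a
// sine wave. A real synth would write samples like these into a
// MediaStream whose output feeds an <audio> element.
function noteOn(note, sampleRate) {
  const freq = noteToFrequency(note);
  const samplesPerCycle = Math.round(sampleRate / freq);
  const samples = new Float32Array(samplesPerCycle);
  for (let i = 0; i < samplesPerCycle; i++) {
    samples[i] = Math.sin((2 * Math.PI * i) / samplesPerCycle);
  }
  return samples;
}
```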


> But I don't have a <midi> element to route the output to (and that has the
> same interface-selection needs as input).
>

You mean to route MIDI data to an output device that accepts MIDI directly?
For that, we'll need a brand-new DOM API for selecting the output device. That
API could let you connect a MediaStream to the device.
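Whatever shape that API takes, the data it carries is already well defined by the MIDI 1.0 spec: a note-on message is three bytes (status, note, velocity). A sketch of the encoding — `sendToOutputDevice` is a purely hypothetical placeholder for the not-yet-designed device API, nothing like it existed at the time:

```javascript
// Encode a MIDI note-on message per the MIDI 1.0 spec:
// status byte 0x90 | channel, then 7-bit note number and velocity.
function encodeNoteOn(channel, note, velocity) {
  return new Uint8Array([0x90 | (channel & 0x0f), note & 0x7f, velocity & 0x7f]);
}

// Hypothetical sink: stands in for whatever device-selection API the
// group might define. Here the "device" is just an object with a buffer.
function sendToOutputDevice(device, bytes) {
  device.buffer.push(...bytes);
}
```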

> (Emphasis mine.)  I agree, I think a significant value MediaStreams provide
> is synchronization and real-time processing immune from main-thread latency.
>  My argument has been that that is less important with MIDI, and I'm
> concerned about the complexity of the programming model that arises from
> this model - enough so that I was thinking along the same lines - generally
> it won't hurt to simply deliver MIDI controller input to the main thread as
> regular DOM events.  (In Windows, MIDI messages are pumped through a message
> loop, frequently on the same thread.)
>

Don't underestimate main-thread latency: poorly written Web apps (or Web
apps that share a main thread with other, poorly written Web apps, which
still happens a lot in all browsers) can easily see tens of milliseconds of
latency, or even unbounded delays.
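The failure mode Rob describes is easy to reproduce: any long-running task on the main thread delays every event queued behind it. A minimal sketch (the 30 ms figure is arbitrary):

```javascript
// Simulate a poorly written handler that blocks the main thread.
function busyWait(ms) {
  const end = Date.now() + ms;
  while (Date.now() < end) { /* spin; nothing else can run */ }
}

// Any MIDI (or other) event that becomes due during the busy-wait
// cannot be delivered until it ends, so it arrives at least this late.
const eventDue = Date.now();
busyWait(30);
const observedLag = Date.now() - eventDue;
```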

I don't feel strongly about it, though. MediaStreams could be used to process
MIDI data, and I think that would have some benefits, but if someone creates a
main-thread-only MIDI API, that wouldn't bother me.

Rob
-- 
"If we claim to be without sin, we deceive ourselves and the truth is not in
us. If we confess our sins, he is faithful and just and will forgive us our
sins and purify us from all unrighteousness. If we claim we have not sinned,
we make him out to be a liar and his word is not in us." [1 John 1:8-10]
Received on Thursday, 6 October 2011 22:16:43 UTC
