- From: Chris Wilson <cwilso@google.com>
- Date: Thu, 6 Oct 2011 15:30:02 -0700
- To: robert@ocallahan.org
- Cc: Olli@pettay.fi, public-webevents@w3.org
- Message-ID: <CAJK2wqU=c1RF-_X=7ndbK-6AiwZ0-qUN81Pdq=+57UC+BCu2tQ@mail.gmail.com>
> > But I don't have a &lt;midi&gt; element to route the output to (and that has the
> > same interface-selection needs as input).
>
> You mean to route MIDI data to an output device that accepts MIDI directly?
> For that, we'll need brand-new DOM API for selecting the output device. That
> API could let you connect a MediaStream to the device.

Yes, that's what I mean - and the selection of MIDI input and output devices
are uniform problems.

> Don't underestimate main-thread latency: poorly written Web apps (or Web
> apps sharing a main thread with other, poorly written Web apps, which still
> happens a lot in all browsers) can easily see tens of milliseconds of
> latency, or even arbitrary amounts.

True enough. I'm thinking of the simplicity of output more than I'm thinking
of input. I'd hate to force it to be in another thread (or more to the point,
for the developer to have to think about threading) to write simple MIDI apps.

> I don't really care though. MediaStreams could be used to process MIDI
> data, and I think that would have some benefits, but if someone creates
> main-thread-only MIDI APIs that wouldn't bother me.

To be clear - regardless of my desire to have simple APIs, I'm additionally
intrigued by the idea of MIDI streams, if only due to the scenario you
mentioned yesterday - thinking of a use case like a JS-implemented
synthesizer as a stream processor, with MIDI as an input stream and audio as
an output stream.

That wasn't a "I'll go think about that, until you agree I'm right, and then
I'll forget anything you were talking about." :)
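[Editorial note: the "synthesizer as a stream processor" scenario above could be sketched roughly as below. This is a hypothetical illustration, not any API proposed in this thread; the function names are invented, and only the MIDI note-on message layout (status byte 0x90) and the equal-temperament frequency formula are standard MIDI facts.]

```javascript
// Hypothetical sketch: a stream-processing step that consumes raw MIDI
// bytes and emits oscillator parameters for an audio output stage.

// MIDI note number -> frequency in Hz (A4 = note 69 = 440 Hz).
function noteToFrequency(note) {
  return 440 * Math.pow(2, (note - 69) / 12);
}

// Parse one 3-byte channel message; return oscillator parameters for a
// note-on, or null for anything this sketch ignores.
function processMidiMessage(bytes) {
  const status = bytes[0] & 0xf0; // high nibble = message type
  if (status === 0x90 && bytes[2] > 0) { // note-on with nonzero velocity
    return {
      frequency: noteToFrequency(bytes[1]),
      gain: bytes[2] / 127, // velocity scaled to 0..1
    };
  }
  return null; // note-off, controllers, etc. not handled here
}
```

In the scenario described, such a function would sit between a MIDI input stream and an audio output stream, with the surrounding stream plumbing supplied by whatever API eventually exists.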
Received on Thursday, 6 October 2011 22:30:29 UTC