W3C home > Mailing lists > Public > public-audio-dev@w3.org > November 2015

Re: Web MIDI Synths

From: Chris Wilson <cwilso@google.com>
Date: Mon, 30 Nov 2015 06:39:37 -0800
Message-ID: <CAJK2wqWJmV=_DRVdnzM9arDghs1T0h=GJkRfApJ4RmZF3DvHnQ@mail.gmail.com>
To: James Ingram <j.ingram@netcologne.de>
Cc: "public-audio-dev@w3.org" <public-audio-dev@w3.org>
On Mon, Nov 30, 2015 at 2:52 AM, James Ingram <j.ingram@netcologne.de>
wrote:

> Am 28.11.2015 um 15:07 schrieb Chris Wilson:
>
>> I'm not really sure what you mean by "standard" and "non-standard".
>>
> Sorry, I was trying too hard to be succinct. :-)  Here's what I really
> meant:
> Synth controls are either defined in the MIDI standard (e.g. "set pan
> position" [CC10/42]) or they are not (e.g. "set the waveform of oscillator
> 2"), and the API for Web MIDI Synths needs to allow for both categories.
> Gree's soundFont synthesizer only implements standard MIDI controllers,
> including the changing of GM presets. But it by no means implements them
> all.
> Most of the controls that your (Chris') synth implements are not in
> the MIDI standard.
>

Um, that's not true.  Granted, I remap some:

CC1 ("mod wheel"): controls filter modulation amount
CC2 ("breath controller"): controls filter cutoff
CC7 ("volume") and CC11 ("expression"): controls filter Q
CC5 ("portamento time"), CC15 ("undefined") and CC73 ("sound attack time"):
controls overdrive
... and so on.  (Full list visible at
https://github.com/cwilso/midi-synth/blob/master/js/synth.js#L150-L204.)
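For illustration, that kind of CC remapping boils down to a small dispatch table keyed by controller number. This is a hypothetical sketch, not the actual code in cwilso/midi-synth (the real mapping is at the link above); the setter names are invented:

```javascript
// Illustrative synth state - setter names are hypothetical.
const synth = {
  filterMod: 0, cutoff: 0, overdrive: 0,
  setFilterMod(v) { this.filterMod = v / 127; },
  setCutoff(v)    { this.cutoff = v / 127; },
  setOverdrive(v) { this.overdrive = v / 127; },
};

// Several CC numbers can be remapped onto the same parameter.
const ccMap = {
  1:  v => synth.setFilterMod(v),   // mod wheel
  2:  v => synth.setCutoff(v),      // breath controller
  5:  v => synth.setOverdrive(v),   // portamento time (remapped)
  73: v => synth.setOverdrive(v),   // sound attack time (remapped)
};

function onMIDIMessage(ev) {
  const [status, cc, value] = ev.data;
  if ((status & 0xf0) === 0xb0) {   // Control Change, any channel
    const handler = ccMap[cc];
    if (handler) handler(value);    // unrecognized CCs are ignored
  }
}

onMIDIMessage({ data: [0xb0, 1, 64] }); // mod wheel at half travel
```

The point being that the mapping is chosen for the keyboard at hand, not derived from any canonical CC list.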

In short, I set up CCs based on the keyboards I would likely demo on (e.g.
my Alesis Vortex), rather than adding a whole separate mapping layer.  I'm
not personally convinced you CAN come up with an exhaustive list of
"standard" CCs, since many of the MIDI ones don't even apply to a lot of
synthesizers (e.g. the aforementioned "portamento time").


> The prime need for this, imo, is to resolve what we need to expose as an
>> API...
>>
>
> Host applications need to know which controls are going to react to which
> messages, so I added a controls declaration to the synths' API.
>

Actually, I don't know that that is a MUST.  It's useful, but it's
informative, not a requirement.  If the synth I select doesn't respond to
"celeste level", I'm not sure that's catastrophic.


> I think that's a MUST. If the host sends a control message to a synth that
> hasn't implemented it, then the synth should throw an exception.
>

Throwing exceptions is a nuclear option.  I'd recommend a much softer way
of expressing "I don't support that".
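One softer way, as a sketch: have the send method return a status instead of raising. The method name here is hypothetical, not from any spec:

```javascript
// Sketch: report "not supported" via a return value, not an exception.
class VirtualSynth {
  constructor(supportedCCs) {
    this.supported = new Set(supportedCCs);
  }
  // sendControl is a hypothetical host-facing method.
  sendControl(cc, value) {
    if (!this.supported.has(cc)) {
      return false;            // "I don't support that" - nothing blows up
    }
    // ... apply the control change here ...
    return true;
  }
}

const s = new VirtualSynth([1, 7, 10]);
s.sendControl(10, 64);  // accepted
s.sendControl(94, 0);   // celeste level quietly declined
```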


> Exactly *how* the declaration is formulated needs standardizing: Should
> there be separate attributes for "controls" and "customControls"? How,
> exactly, should the custom control attributes be named?
>

Indeed, this is a challenging area, and one facet of the core problem of
defining a virtual synth API.

Custom Controls:
> I'm rather sceptical of the standard MIDI controls API. It was designed in
> the 1980s for hardware devices. We now have 30 years more experience
> designing interfaces, and are talking about software. That's a different
> ball game.
> The standard includes the general "non-registered parameter" control [CC
> 99/98] that is supposed to allow for non-standard controls, but why should
> software have to implement that (and everything it entails -- the Data
> Button controls etc.) rather than just telling the host directly which
> controls it has implemented? It would be much more work, at programming-,
> load- and run-times, than just implementing the control, declaring it and
> using it.
> A similar situation exists with setting pitch wheel deviation. I think
> this was an oversight in the original standard, which (inefficiently)
> requires the host to send a sequence of "registered parameter" controls.
> Why should a software synth have to implement the "registered parameter"
> control? In fact Gree decided not to, and I think they were right. They
> just implemented a "set pitch wheel deviation" control. Much simpler for
> everyone, and much more efficient.
>

This is where we deviate entirely.  I think it's a fantastically bad idea
to redo a standardized, albeit slightly goofy, mechanism just to make it
simpler, in a proprietary way.  If Gree had advocated for adding that to
the MIDI spec, great - but implementing a replacement in a proprietary way
is a very non-webby, non-standards way of doing things.
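For reference, the standardized mechanism in question is RPN 0,0 ("pitch bend sensitivity"): the host sends a sequence of Control Change messages rather than one. A sketch of the raw bytes (the function itself is illustrative):

```javascript
// The standard MIDI RPN sequence for setting pitch wheel deviation:
// select RPN 0,0 then send the value via Data Entry, then null the RPN.
function pitchBendRangeMessages(channel, semitones, cents) {
  const cc = 0xb0 | (channel & 0x0f); // Control Change status byte
  return [
    [cc, 101, 0],        // RPN MSB = 0
    [cc, 100, 0],        // RPN LSB = 0 -> pitch bend sensitivity
    [cc, 6, semitones],  // Data Entry MSB = semitones
    [cc, 38, cents],     // Data Entry LSB = cents
    [cc, 101, 127],      // null the RPN so stray Data Entry
    [cc, 100, 127],      //   messages don't retrigger it
  ];
}

pitchBendRangeMessages(0, 2, 0); // six messages to set one parameter
```

Six messages versus a single direct "set pitch wheel deviation" call is the efficiency trade-off James is pointing at; the counter-argument above is about interoperability, not message count.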


> Hardware may be stuck with the 1980s standard, but software has to look
> out for itself. As in Javascript, I think we should use MIDI's good parts,
> and deprecate the not-so-good parts. For 21st century software, I'd start
> by deprecating all the controls that are unnecessary and/or lead to
> inefficiency (e.g. "non registered parameter") and anything that has no
> precise meaning (e.g. "general purpose button 1"). Open to discussion, of
> course! :-)
>

In my opinion, attempting to reinvent (or revise) MIDI is not something we
should do.  The API exposed for virtual synthesizers may not be MIDI -
that's not a problem I'm attempting to take on right now - but I think
reinventing MIDI's parameter control is something that should be done, if
at all, in the MMA, not here.


> Issue #110 <https://github.com/WebAudio/web-midi-api/issues/110>
> originally asked about hardware synths in the browser's implementation of
> the Web MIDI API. The question may not be solvable there, but as it stands
> in my API for Web Audio implementations, software synths that support GM
> instruments declare a setSoundFont function. Those that don't, don't. Note
> that supporting GM instruments does not mean that the whole MIDI standard
> is implemented.


The whole MIDI standard meaning CC list, or the whole GM standard (defined
set of CCs, sounds, etc)?  Because if you say you "support GM", I'm pretty
sure you need to support all of GM.


> (Apropos: This is supposed to be a forum for developers. Where are they
> all? Maybe we should be talking on the other list. After all, we *are*
> talking about *implementing* (part of) the Web MIDI API. I think I'm going
> to dare a cross-posting... :-))
>

For reference - there are developers on this list (for example, I'd
categorize Joe as a developer).  I think it's completely fine to drum up
interest on the developer list, though I'd like to just encourage people to
discuss it here (or, preferably, in the GH issue on virtual synths).


> Issue #45 <https://github.com/WebAudio/web-midi-api/issues/45> doesn't
> seem to be a problem for Web Audio synths. They are just ordinary Web Audio
> applications that don't need any special ports. Looks to me as if browsers
> could close this issue too...
>

This issue is about the whole idea of "out of tab" ports - i.e. on the path
to running a synth in one "web process" that is used by others.  I think
it's quite relevant to both in-browser and out-of-browser WA platforms.


> As for Chrome's "decision" to ban the GS synth in Windows - that wasn't
>> really a decision.  It was crashing the browser process, without user
>> intervention.  I expect it will get re-enabled (issue #150 <
>> https://github.com/WebAudio/web-midi-api/issues/150>) if the user wants
>> it, but we can't let external code be run without the user being asked.
>> That said, I expect a Service-Workered virtual synth is going to be the
>> best pre-installed synth we can hope for.
>>
>

> That's really off-topic for this forum too. :-))


Not at all - it's a significant concern for implementers that they may be
inadvertently running third-party code in their process.


> 2. The Microsoft GS Synth is still working on Firefox+Jazz+WebMIDIAPIShim.
>

The difference there, btw, is that the user had to have personally
installed Jazz on their machine.  Like any other plugin, that may open
security holes.
Received on Monday, 30 November 2015 14:40:08 UTC
