
Re: Web MIDI Synths

From: Chris Wilson <cwilso@google.com>
Date: Sat, 28 Nov 2015 06:07:06 -0800
Message-ID: <CAJK2wqUV-ZEBVExVFR1YsB4OqnAzvhBXHcYqMHwDkkTAHiL4rg@mail.gmail.com>
To: James Ingram <j.ingram@netcologne.de>
Cc: "public-audio-dev@w3.org" <public-audio-dev@w3.org>

I'm not really sure what you mean by "standard" and "non-standard".
 ("Standard MIDI controls"?)  My MIDI synth does, in fact, respond to a
number of MIDI continuous controllers (and polyphonic aftertouch, at one
point at least.)  At any rate, I would suggest avoiding that terminology
without carefully defining what standard you mean.  Unless you specifically
mean General MIDI/GS/XG or the like, the required set is pretty loose, and
being called "non-standard" is sure to make me bristle.  :)
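(To make the continuous-controller point concrete, here is a minimal sketch — the helper name is mine, not from any synth code — of the three-byte Control Change message a Web MIDI output actually receives. There is nothing "standard" or "non-standard" about the wire format itself; what varies between synths is which controller numbers they respond to.)

```javascript
// Hypothetical helper (not from the synth code): builds the three-byte
// MIDI Control Change message a Web MIDI output would receive.
// Status byte is 0xB0 | channel; controller number and value are 0-127.
function controlChange(channel, controller, value) {
  return [0xB0 | (channel & 0x0F), controller & 0x7F, value & 0x7F];
}

// CC 7 (channel volume) at full level on channel 0:
const msg = controlChange(0, 7, 127);
// A synth's MIDIOutput-style send() would then receive [0xB0, 7, 127].
```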

In any case, it's good to get people thinking about this.  I think that
this is the crux of virtual instrument support (issue #124
<https://github.com/WebAudio/web-midi-api/issues/124>, also likely
dependent on issue #45 <https://github.com/WebAudio/web-midi-api/issues/45> on
virtual MIDI ports).  The prime need for this, imo, is to resolve what we
need to expose as an API to describe a virtual (or real) instrument - e.g.
whether a given device supports general MIDI (issue #110
<https://github.com/WebAudio/web-midi-api/issues/110>), managing a "pipe"
of audio data from the virtual instrument (aforementioned #124), and of
course, this all needs to be done in Workers (issue #99
<https://github.com/WebAudio/web-midi-api/issues/99>, and issue #16 in Web
Audio <https://github.com/WebAudio/web-audio-api/issues/16>).
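One way such an API might shape up — purely a sketch; the class and callback names here are illustrative, not from the Web MIDI spec or either issue thread — is a virtual output that exposes the same send() surface as a real MIDIOutput, so a host app can't tell it apart from a hardware port:

```javascript
// Sketch only: one possible shape for a virtual output device.
// VirtualMIDIOutput and onmessage are assumed names, not spec'd API.
class VirtualMIDIOutput {
  constructor(name, onmessage) {
    this.name = name;           // how the device would list itself to apps
    this.onmessage = onmessage; // the synth code that consumes MIDI bytes
  }
  // Same signature as a real MIDIOutput.send(): a sequence of bytes
  // plus an optional DOMHighResTimeStamp for scheduling.
  send(data, timestamp) {
    this.onmessage(Uint8Array.from(data), timestamp);
  }
}

// A host app drives it exactly as it would a hardware port:
const received = [];
const synth = new VirtualMIDIOutput("Sf2Synth1", (data) => received.push(data));
synth.send([0x90, 60, 100]); // note-on, middle C, velocity 100
```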

As for Chrome's "decision" to ban the GS synth in Windows - that wasn't
really a decision.  It was crashing the browser process, without user
intervention.  I expect it will get re-enabled (issue #150
<https://github.com/WebAudio/web-midi-api/issues/150>) if the user wants
it, but we can't let external code be run without the user being asked.
That said, I expect a Service-Workered virtual synth is going to be the
best pre-installed synth we can hope for.

On Sat, Nov 28, 2015 at 3:41 AM, James Ingram <j.ingram@netcologne.de>
wrote:

> Hi,
> A few months ago, Chrome's decision to ban the Windows GS Wavetable Synth
> highlighted the need for MIDI output devices that are neither hardware nor
> installed plugins. If they want to reach the widest possible audience, web
> application programmers can't expect their users to pre-install anything.
> Those of us programming such apps were left high and dry. The Windows GS
> Wavetable Synth was only available to Windows users anyway...
>
> In response to this problem, I've adapted the code for two existing software
> synths so that they implement the Web MIDI API for Output Devices, and have
> embedded these in a host application [1], [2] to show that they actually
> work.
> The synthesizers can quite easily be lifted out and used by any web app
> that sends MIDI messages to an output device (see [3] and [4]).
>
> There are two main categories of Web MIDI Synth: Those that implement
> Standard MIDI Controls, and those that don't. I've adapted one of each.
>
> The Standard Web MIDI Synth (Sf2Synth1 [5]) is working fine, but really
> needs optimising by people who know more about programming the Web Audio
> API than I do. It also needs testing with other soundFonts...
> Loading times will always be a problem when working with soundFonts, but
> I've tried to mitigate the problem by separating the synth code from the
> soundFont. Host applications only need to load presets that they are
> actually going to use (e.g. a grand piano). It also has to be said that
> browsers cache soundFonts, so applications start much faster the second
> time.
> Note that Standard Web MIDI Synths can be used interchangeably with the
> hardware devices provided by browser implementations of the Web MIDI API.
>
> The Non-Standard Web MIDI Synth (cwMIDISynth [6]).
> Software synths that implement the interface I've designed for this synth
> can be used in any web application. There are no restrictions on the
> controls the synth programmer can use; they just have to be declared
> properly. MIDI can control anything.
> So, if you have programmed a software synthesizer, and want it to be used
> in a web application, maybe you could implement the interface?
>
> Any discussion would be very welcome, of course.
>
> All the best,
> James
>
> [1] GitHub: https://github.com/notator/WebMIDISynthHost
> [2] Web page:
> http://james-ingram-act-two.de/open-source/WebMIDISynthHost/host.html
> [3] Simple demo:
> http://james-ingram-act-two.de/open-source/SimpleMIDISynthHost/host.html
> [4] Simple demo:
> http://james-ingram-act-two.de/open-source/SimpleSoundFontSynthHost/host.html
> [5] Adapted from gree's sf2synth.js at https://github.com/gree/sf2synth.js
> [6] Adapted from Chris Wilson's MIDI-Synth at
> https://webaudiodemos.appspot.com/midi-synth/index.html
>
>
>
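The declaration-based interface James describes might look something like the following — a hypothetical sketch under my own assumptions (none of these property or method names are taken from the actual cwMIDISynth interface): the synth lists its controls so a host can build a UI for them and route incoming Control Change messages to whichever control declared that CC index.

```javascript
// Hypothetical sketch of a control-declaring synth. The field names
// (controls, ccIndex, defaultValue) are illustrative assumptions only.
const sketchSynth = {
  name: "exampleWebMIDISynth",
  controls: [
    { ccIndex: 1,  name: "vibrato depth", defaultValue: 0 },
    { ccIndex: 74, name: "filter cutoff", defaultValue: 64 },
  ],
  state: {},
  // Receives raw MIDI bytes, like a MIDIOutput.send() target.
  send(data) {
    const [status, index, value] = data;
    if ((status & 0xF0) === 0xB0) {          // Control Change message
      const ctl = this.controls.find(c => c.ccIndex === index);
      if (ctl) this.state[ctl.name] = value; // declared, so route it
    }
  },
};

sketchSynth.send([0xB0, 74, 100]); // sets "filter cutoff" to 100
```

The point of declaring the controls up front is that the host never needs synth-specific knowledge: it reads the list, renders a knob per entry, and forwards plain MIDI.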
Received on Saturday, 28 November 2015 14:07:35 UTC
