
Re: Timing limitations when programming MIDI with Javascript

From: Jussi Kalliokoski <jussi.kalliokoski@gmail.com>
Date: Tue, 5 Jun 2012 14:44:32 +0300
Message-ID: <CAJhzemXqG4TjVzcW5L5P59jhejU0EAXV1OoUi6x8ObAM_H7ZXg@mail.gmail.com>
To: James Ingram <j.ingram@netcologne.de>
Cc: Chris Wilson <cwilso@google.com>, public-audio@w3.org
On Tue, Jun 5, 2012 at 12:14 PM, James Ingram <j.ingram@netcologne.de> wrote:

> Hi Jussi, Chris,
>  Jussi: Garbage collection isn't necessarily a problem, since
>> implementations will probably just use JS wrappers for the messages and the
>> data will actually be stored in an underlying struct, and MIDI isn't
>> exactly one of the highest traffic protocols anyway.
> I was thinking of situations in which there have to be large numbers of
> messages in memory waiting to be sent (maybe tens or even hundreds of
> thousands of them). But there are probably strategies for minimizing the
> problem (see below).

This is actually where the timestamps shine. You can run a clock interval of,
say, 200 milliseconds: on each tick you read ahead through the event list,
pick out the events that fall within the next 200 ms, and send each one
stamped with its intended time. That way there's no need for an individual
setTimeout per event, which is very CPU-intensive; and with individual
timeouts, events that are supposed to occur at the same time don't necessarily
do so, because GC, rendering, or whatever else is blocking delays the next
timeout.
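As a rough sketch, that lookahead loop could look something like this (the event shape and the send/now helpers here are made-up placeholders, not a real API):

```javascript
// Minimal sketch of the lookahead pattern described above.
// events: [{ timeMs, data }, ...] sorted by absolute time;
// send(data, timeMs) and now() are hypothetical helpers.
const LOOKAHEAD_MS = 200;

function makeScheduler(events, send, now) {
  let index = 0;
  return function tick() {
    const horizon = now() + LOOKAHEAD_MS;
    // Queue everything that falls inside the next 200 ms window,
    // stamped with its exact intended time.
    while (index < events.length && events[index].timeMs <= horizon) {
      send(events[index].data, events[index].timeMs);
      index++;
    }
  };
}

// One coarse timer drives the whole stream instead of one setTimeout per event:
// const tick = makeScheduler(events, sendMIDIMessage, () => performance.now());
// setInterval(tick, LOOKAHEAD_MS / 2);
```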

>     James: I've been investigating the limitations on timing in
>>    Javascript applications:
>>    1. According to the W3C standard, the delay set in setTimeout()
>>    will never be less than 4 milliseconds [2]. In practice, the lower
>>    limit can be much larger [3].
>>    2. The time interval in setInterval() is measured in milliseconds,
>>    so there are limits on the rates at which it can be used to send
>>    MIDI Clock events. (MIDI defines the rate at which MIDI Clocks are
>>    sent in *microseconds*. MIDI Clocks are usually sent every
>>    20 milliseconds or so.)
>> Jussi: Yes, this is exactly why timestamps are quite essential in a MIDI
>> API that will be in a JavaScript environment, they allow you to make the
>> messages happen as accurately as possible, i.e. ahead of time, so that you
>> don't have to rely on unreliable timing systems such as setTimeout().
> It wouldn't be the end of the world if we had to rely on setTimeout() and
> setInterval(). There are going to be limitations somewhere. It's just that
> we have to know where they are. (One writes differently for a xylophone
> than for a tuba.) But of course we want the highest accuracy we can get out
> of these machines.

Yeah sure, there's nothing stopping anyone from using these mechanisms, but
we want to allow accuracy for those who need it.
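For what it's worth, the clamp is easy to observe: request a 0 ms delay repeatedly and measure the actual spacing. In browsers, nested timeouts are clamped to at least 4 ms, so the average comes out near 4 ms rather than 0. A quick sketch (the function name is made up):

```javascript
// Measure the effective floor of setTimeout() by chaining 0 ms timeouts
// and averaging the real elapsed time per timeout.
function measureTimeoutFloor(iterations, done) {
  const start = Date.now();
  let n = 0;
  (function next() {
    if (++n >= iterations) {
      done((Date.now() - start) / iterations); // average ms per timeout
      return;
    }
    setTimeout(next, 0);
  })();
}

// measureTimeoutFloor(100, avg => console.log('avg delay:', avg, 'ms'));
```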

>  Chris: Actually, I was wondering if it would be possible to create arrays
>> of MIDIMessages, to buffer them up with timestamps.
> (see below)
>  Jussi: I'd go for letting the cows pave the path again...
> :-) and
>  When we have actual data on what kind of usage patterns arise we can
>> start thinking about things like this, imho.
> Okay you asked for it. :-)
> I'm programming a MIDI-player.
> This reads MIDI data from an SVG-MIDI score displayed in the browser (like
> the one at [1]), and sends it to a MIDI output device.
> The player has controls like a Flash player (stop, go, pause, set start
> position, set end position, goto start, goto end). There will also be
> channel filters to allow the user to select which channels (=voices) to
> play, and some kind of indicator in the score showing the current
> performance position while playing.
> The test score uses three MIDI channels, one per staff.

This looks cool! I can imagine we'll have a web application like MuseScore.

> I'm still learning Javascript so, as an exercise which I knew was going to
> have to be re-written, I first tried the naive approach to see what would
> happen (I wanted to hear something):
> 1. read the MIDI data from the DOM
> 2. convert it to midiMessages
> 3. put all the messages in a single list of midiEvents with a _delay_ in
> milliseconds between them.
> 4. send the list to midiOut using setTimeout().
> (a midiEvent is a list of midiMessages that are to be sent "at the same
> time". But note that they are actually sent in a known order, so patch
> change messages in the midiEvent should affect messages in the same channel
> lower down the midiEvent.)
> Result:
> a) It takes about 35 seconds for my machine (which is quite fast) to read
> the DOM and create the list of midiEvents. That can probably be improved as
> my Javascript improves, but reading the DOM is always going to take
> significant amounts of time.
> b) The performance was considerably slower than the reference mp3 (which
> is the conversion of a MIDI file I made using C#). Obviously this is
> because the midiEvent list contained lots of delays less than 4ms.

35 seconds is quite a long time to read a score (our JS audio codecs can
decode complete songs in a matter of seconds). I suggest reading the score in
a worker thread, using a JS-based DOM library, to avoid the overhead that may
come with the browser's built-in DOM.
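Roughly like this, as a sketch. The score format and parseScoreText() here are made-up stand-ins; a real version would run a JS XML parser inside the worker, since workers cannot touch the page's DOM:

```javascript
// Hypothetical parser: one "NOTE channel pitch velocity delayMs" event per
// line of a serialized score, returning plain objects that are cheap to
// postMessage() back to the main thread.
function parseScoreText(text) {
  return text.trim().split('\n').map(line => {
    const [ , channel, pitch, velocity, delayMs ] = line.split(/\s+/);
    return { channel: +channel, pitch: +pitch, velocity: +velocity, delayMs: +delayMs };
  });
}

// worker.js would then be just:
//   self.onmessage = e => self.postMessage(parseScoreText(e.data));
// and the main thread:
//   const w = new Worker('worker.js');
//   w.postMessage(new XMLSerializer().serializeToString(scoreSvg));
//   w.onmessage = e => player.enqueue(e.data);
```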

I'm currently working on the user interface, but will soon be tackling the
> MIDI again.
> My current plan is as follows:
> 1. Do the DOM reading and performance in a separate threads. (I haven't
> used, or even properly looked at Javascript Workers yet, so I don't know if
> this is going to work.) I imagine reading ca 5 seconds from the DOM, giving
> the messages to the performing thread, and going back to reading the DOM.
> When the performing thread is ready, the DOM thread gives it what its got
> and then goes back to reading.
> Note that reading the DOM and creating midiMessages are two separate
> operations. It might be better to let the performance thread do the
> midiMessage construction, so that the DOM thread has less to do. That would
> mean passing the midiMessage _parameters_ to the performing thread rather
> than the midiMessages themselves.
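Passing parameters rather than messages works out nicely, since plain objects are cheap to structure-clone across a Worker boundary. A sketch of building the actual byte arrays on the performing side (the status bytes follow the MIDI spec; the parameter shape is illustrative):

```javascript
// Build a MIDI message (array of bytes) from plain parameters received
// from the DOM-reading thread.
function paramsToMessage(p) {
  // p: { kind: 'noteOn' | 'noteOff' | 'patchChange', channel, data1, data2 }
  const status = { noteOn: 0x90, noteOff: 0x80, patchChange: 0xC0 }[p.kind];
  if (status === undefined) throw new Error('unknown message kind: ' + p.kind);
  const bytes = [status | (p.channel & 0x0F), p.data1 & 0x7F];
  // Program (patch) change messages carry only one data byte.
  if (p.kind !== 'patchChange') bytes.push(p.data2 & 0x7F);
  return bytes;
}
```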
> 2. Chord symbols contain MIDI info which is fundamentally in two parallel
> streams:
> a) a stream containing ChordOns, ChordOffs and patch changes. (Chord
> symbols can represent ornaments.)
> b) a stream containing slider (controller) info.
> I want to try writing a MIDIChord.play(midiOut) function which would
> actually use two threads to send the two streams. This makes _delays_ much
> more controllable. There would not be an undue proliferation of Workers,
> because one only really needs two per channel.
> It's quite sufficient to send sliderEvents every 50ms or so (20 per
> second). Each sliderEvent can contain messages for all the sliders
> currently in use in the chord.
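The 50 ms bucketing could be sketched like this, keeping only the latest value of each active slider per event (the sample and event shapes are made up for illustration):

```javascript
// Group continuous controller samples into one event per bucketMs window,
// where each event carries the most recent value of every slider that
// changed during that window.
function bucketSliderData(samples, bucketMs) {
  // samples: [{ timeMs, controller, value }, ...] in time order
  const buckets = new Map(); // bucket start time -> { controller -> value }
  for (const s of samples) {
    const slot = Math.floor(s.timeMs / bucketMs) * bucketMs;
    if (!buckets.has(slot)) buckets.set(slot, {});
    buckets.get(slot)[s.controller] = s.value; // last value in the bucket wins
  }
  return [...buckets.entries()].map(([timeMs, controllers]) => ({ timeMs, controllers }));
}
```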
> I still don't see why I need to bring absolute time into this.  Delays are
> something else.  I'm quite happily using millisecond units, but you might
> have a good reason for wanting to use microseconds.

You don't have to, but I suggest you do; it scales much better than having
lots of timeouts and such. And DOMHighResTimeStamp *is* in milliseconds, but
it has a decimal part for higher accuracy; no one is forcing you to use
anything smaller than milliseconds, or even that. :)
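That is, you can stay in millisecond units and the sub-millisecond precision comes along for free in the fraction:

```javascript
// performance.now() returns a DOMHighResTimeStamp: milliseconds with a
// fractional part, so sub-millisecond timing fits in millisecond units.
const t = performance.now();      // e.g. 1234.567 = 1234 ms and 567 µs
const wholeMs = Math.floor(t);
const fractionalMs = t - wholeMs; // the sub-millisecond part, in [0, 1)
```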

Received on Tuesday, 5 June 2012 11:45:02 UTC
