
Re: Audio-ISSUE-105 (MIDI timestamp resolution): timestamps in MIDI should use High Resolution Time [MIDI API]

From: Chris Wilson <cwilso@google.com>
Date: Fri, 1 Jun 2012 09:47:10 -0700
Message-ID: <CAJK2wqVnzAVN2UwhyV4rVoBVaX+BQuZVy82HZpRzjCQWkHZQzg@mail.gmail.com>
To: Jussi Kalliokoski <jussi.kalliokoski@gmail.com>
Cc: Audio Working Group <public-audio@w3.org>
Although I'm not completely opposed to this change, I'd argue against the
point that millisecond resolution is insufficient.  If using hardware MIDI
ports, it takes roughly a third of a millisecond to SEND a single byte of
data - so simply transferring a typical three-byte message takes close to a
full millisecond anyway - and the latency in processing at the other end
is typically much, much higher than 1ms (I seem to recall around 4-7ms was
not atypical for hardware synths, but can't find my reference ATM).
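As a back-of-the-envelope check on the wire-time figures above (my own sketch, not from any spec): MIDI 1.0 over a 5-pin DIN cable runs at 31,250 baud, and each byte is framed as 10 bits (1 start + 8 data + 1 stop), so the transfer time works out as:

```javascript
// Wire time for MIDI 1.0 over a 5-pin DIN cable.
const BAUD = 31250;          // bits per second on the DIN transport
const BITS_PER_BYTE = 10;    // 1 start + 8 data + 1 stop bit

function midiWireTimeMs(byteCount) {
  return (byteCount * BITS_PER_BYTE / BAUD) * 1000;
}

console.log(midiWireTimeMs(1).toFixed(2)); // one status/data byte: 0.32 ms
console.log(midiWireTimeMs(3).toFixed(2)); // typical note-on message: 0.96 ms
```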

That said, of course, it's not a bad idea to future-proof better than that;
many MIDI use cases will never actually see a 5-pin-DIN cable.  However,

1) I find the usage of DOMHighResTimeStamp confusing, as it's
deliberately tied (in terms of its "zero" point) to the Performance
interface.  Referencing it here doesn't seem to add any value, since it's
simply a double; we would still need to provide a way to get system time in
double units, and I don't think using the PerformanceTiming interface is
the most intuitive thing to do.  Or we could suggest that people use
Date.now() (even though it's millisecond-precision), which is livable, I
suppose.  But we do need to define that.  I would recommend either a) using
a double for number of milliseconds and recommending people use Date.now(),
or b) (my preference) using a double to represent number of seconds, to be
uniform with the Web Audio API.  I'm ambivalent about whether we use the
same currentTime from the audioContext as Web Audio does, or Date.now().
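For what it's worth, option b) doesn't cost any usable precision. A quick sketch (my own illustration, nothing here is from any spec) of why a double counting seconds since the Unix epoch still resolves well below a microsecond:

```javascript
// A double has 53 mantissa bits, so the gap between adjacent
// representable values near t seconds is at most t * 2^-52.
const nowSeconds = Date.now() / 1000;            // ~1.3e9 in mid-2012
const gapSeconds = nowSeconds * Math.pow(2, -52);

console.log(gapSeconds < 1e-6);                  // sub-microsecond resolution
```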

2) I would absolutely recommend that we (similar to DOMHighResTimeStamp)
explicitly state that implementations are allowed to have millisecond-only
precision.  The underlying system APIs on Windows are based in
milliseconds, for example - unless implementers build on another API, the
timestamps on MIM_DATA
<http://msdn.microsoft.com/en-us/library/dd757284(v=vs.85)> are in
milliseconds.  The precision of the underlying API on OS X is harder to
determine, but I think it is higher.
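A hypothetical illustration of that allowance (quantizeToMs is my own example helper, not a proposed API): an implementation with only a millisecond-granularity clock could still report a seconds-based double, simply quantized to 0.001 s steps:

```javascript
// Quantize a seconds-based timestamp to whole-millisecond steps,
// as a millisecond-precision implementation effectively would.
function quantizeToMs(seconds) {
  return Math.round(seconds * 1000) / 1000;
}

console.log(quantizeToMs(1.23456789)); // → 1.235
```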


On Fri, Jun 1, 2012 at 8:48 AM, Jussi Kalliokoski <
jussi.kalliokoski@gmail.com> wrote:

> This issue is now pending review per
> https://dvcs.w3.org/hg/audio/rev/b78b7c5e906e .
>
>
> On Fri, Jun 1, 2012 at 6:22 PM, Jussi Kalliokoski <
> jussi.kalliokoski@gmail.com> wrote:
>
>> Good catch, thank you! As I planned it, the timestamp should have been a
>> floating point value, allowing for sub-millisecond precision, but
>> DOMHighResTimeStamp is actually a better fit for this.
>> I will make the necessary changes to the spec.
>>
>> Cheers,
>> Jussi
>>
>>
>> On Fri, Jun 1, 2012 at 6:16 PM, Audio Working Group Issue Tracker <
>> sysbot+tracker@w3.org> wrote:
>>
>>> Audio-ISSUE-105 (MIDI timestamp resolution): timestamps in MIDI should
>>> use High Resolution Time [MIDI API]
>>>
>>> http://www.w3.org/2011/audio/track/issues/105
>>>
>>> Raised by: Adam Goode
>>> On product: MIDI API
>>>
>>> The current MIDI API specifies timestamp as a long representing
>>> "milliseconds from the UNIX Epoch".
>>>
>>> For MIDI applications, millisecond resolution is insufficient and can
>>> cause noticeable jitter.
>>>
>>> Using absolute wallclock time is also problematic, as it is subject to
>>> system clock skew.
>>>
>>> The MIDI timestamp should use High Resolution Time
>>> (DOMHighResTimeStamp), which solves these problems:
>>>
>>> http://dvcs.w3.org/hg/webperf/raw-file/tip/specs/HighResolutionTime/Overview.html
>>>
Received on Friday, 1 June 2012 16:47:41 GMT