Re: Audio-ISSUE-105 (MIDI timestamp resolution): timestamps in MIDI should use High Resolution Time [MIDI API]

The reason I kind of like the idea of having the timestamps specified as
DOMHighResTimeStamps is that it allows the accuracy to live outside the
spec: if, for example, it somehow becomes desirable in the future to have
more accuracy than double precision, DOMHighResTimeStamp will probably
have been updated to a higher precision by then - although I don't think a
use case for higher resolution than double will come along very soon.
Having the timestamps be relative to the creation time of the MIDIAccess
is actually a very good idea, because it makes accuracy deterioration a
slightly smaller problem. We probably need to introduce some method to
get the current timestamp of the MIDIAccess as well.

Cheers,
Jussi
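
(As a rough sketch of the MIDIAccess-relative clock idea above - the
MidiClock name and its now() method are hypothetical, nothing like this is
in the draft spec - it could look something like this in TypeScript:)

    // Hypothetical sketch only: a clock whose zero point is the moment the
    // object is created, as suggested for MIDIAccess above.
    class MidiClock {
      private readonly epoch: number;

      constructor() {
        // performance.now() is monotonic and unrelated to wallclock time.
        this.epoch = performance.now();
      }

      // Current time in milliseconds (a double) since this clock was created.
      now(): number {
        return performance.now() - this.epoch;
      }
    }

    const clock = new MidiClock();
    // A message meant to fire 10 ms from now would carry this timestamp:
    const timestamp = clock.now() + 10;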

On Fri, Jun 1, 2012 at 10:39 PM, Chris Wilson <cwilso@google.com> wrote:

> Gah!  yes, sorry, didn't hit reply-all. Only thing in Gmail I'm still not
> quite used to, somehow.
>
> Yes, I agree that it's not great to have so many different timestamp
> formats and reference points.  If the desire is to divorce from wallclock
> time, then I suppose we could do what audioContext does - count from when
> MIDIAccess is created.  As written in Jussi's last edit, though, it's
> "current time" (unfortunately, the definition of what that means (ms since
> UNIX epoch) was removed).  I don't have strong feelings.  I mostly disliked
> DOMHighResTimeStamp because it's one more reference, for what is
> essentially a trivial thing (monotonically increasing, number of
> milliseconds, unrelated to wallclock time), but that spec is really defined
> for uses relating to Performance, so it's confusing to read as a solution
> for this problem.  I think we would need to define our own zero point.
>
> I like seconds just because, if it's not an integer anyway, I think it's
> easier for humans to think that way, but I don't care that strongly.  The
> newer MIDI interfaces in Windows, I note, use a longlong (64bit int) of
> units of 100ns (i.e. tenths of a microsecond, or 0.0001 milliseconds).  I
> think that is kind of confusing, personally.  Seconds are prevalent in the
> Web Audio API, but milliseconds (as ints) are common in other web
> programming APIs, so I could be okay with either.
>
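
(To make the unit differences above concrete - milliseconds, seconds, and
Windows-style 100 ns ticks - a couple of throwaway conversion helpers; the
names are made up:)

    // Illustrative conversions between the unit conventions discussed above.
    // (Real Windows timestamps are 64-bit integers; plain numbers are used
    // here just for illustration.)
    const TICKS_PER_MS = 10_000; // 100 ns units per millisecond
    const MS_PER_SECOND = 1_000;

    function windowsTicksToMs(ticks: number): number {
      return ticks / TICKS_PER_MS;
    }

    function msToSeconds(ms: number): number {
      return ms / MS_PER_SECOND;
    }

    // e.g. 1_234_567 ticks of 100 ns = 123.4567 ms = 0.1234567 s
    console.log(msToSeconds(windowsTicksToMs(1_234_567)));
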
> On Fri, Jun 1, 2012 at 12:17 PM, Adam Goode <agoode@google.com> wrote:
>
>> On Fri Jun 01 13:53:52 GMT-0400 2012, Chris Wilson <cwilso@google.com>
>> wrote:
>>
>>> Well there you go - it's been quite a while since I wrote Windows code.
>>>  :)
>>>
>>>
>>> >The point of DOMHighResTimeStamp is that it is divorced from
>>> wallclock time.
>>>
>>> So is audioContext.currentTime.
>>>
>>>
>> Hmmm. It's not great to have so many different timestamp formats and
>> reference points. It does make sense for audioContext to have its 0 point
>> at its start time. And there is no "start time" for these raw MIDI events.
>> So deferring to page load time seems fine.
>>
>> But the units are different (seconds in float vs. milliseconds in
>> double), and that seems worth addressing.
>>
>>
>> (Did we drop off the public list with this thread?)
>>
>> Adam
>>
>> On Fri, Jun 1, 2012 at 10:49 AM, Adam Goode <agoode@google.com> wrote:
>>>
>>> On Fri, Jun 1, 2012 at 12:47 PM, Chris Wilson <cwilso@google.com> wrote:
>>> >
>>> > Although I'm not completely opposed to this change, I'd argue against
>>> the point that millisecond resolution is insufficient.  If using hardware
>>> MIDI ports, it takes approximately 1/4 of a millisecond to SEND a single
>>> byte of data - so it will take approximately 3/4 of a millisecond to simply
>>> transfer the data anyway - and the latency in processing at the other end
>>> is typically much, much higher than 1ms (I seem to recall around 4-7ms was
>>> not atypical for hardware synths, but can't find my reference ATM).
>>> >
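
(The wire-time figures above follow from the MIDI baud rate: 31250 baud
with 10 bits per byte on the wire works out to roughly a third of a
millisecond per byte, about 1 ms for a typical three-byte message - a
quick back-of-the-envelope check:)

    // Back-of-the-envelope wire time for a 5-pin DIN MIDI connection.
    // MIDI runs at 31250 baud; each byte is framed as 10 bits
    // (start + 8 data + stop).
    const MIDI_BAUD = 31_250;
    const WIRE_BITS_PER_BYTE = 10;

    function wireTimeMs(bytes: number): number {
      return (bytes * WIRE_BITS_PER_BYTE * 1_000) / MIDI_BAUD;
    }

    console.log(wireTimeMs(1)); // ~0.32 ms per byte
    console.log(wireTimeMs(3)); // ~0.96 ms for a typical 3-byte note-on message
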
>>>
>>> The issue is more one of jitter than of processing delay. Though 1ms seems
>>> totally sufficient to me, I could imagine issues with the single byte
>>> timing code (F8) getting some unwanted jitter. But the real win of
>>> this change is monotonicity.
>>>
>>> >
>>> > That said, of course, it's not a bad idea to future-proof better than
>>> that; many MIDI use cases will never actually see a 5-pin-DIN cable.
>>>  However,
>>> >
>>> > 1) I find the usage of DOMHighResTimeStamp very confusing, as it's
>>> deliberately chained (in terms of its "zero" point) to the Performance
>>> interface.  It doesn't seem to add any value to reference it here, since it's
>>> simply a double; we would still need to provide a way to get system time in
>>> double units, as I don't think using the PerformanceTiming interface is the
>>> most intuitive thing to do.  Or suggest that people use Date.now() (even
>>> though it's millisecond-precision), which is livable, I suppose.  But we do
>>> need to define that.  I would recommend either a) using a double for number
>>> of milliseconds, and recommending people use Date.now, or b) (my
>>> preference) using a double to represent number of seconds, to be uniform with
>>> the Web Audio API.  I'm ambivalent about whether we use the same
>>> currentTime from the audioContext as WA or Date.now().
>>> >
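
(Options a) and b) above, written out so the difference is visible; both
helpers are hypothetical:)

    // Option a): a double counting milliseconds, anchored via Date.now().
    function timestampMs(): number {
      return Date.now(); // millisecond precision, follows the system clock
    }

    // Option b): a double counting seconds, uniform with Web Audio's currentTime.
    function timestampSeconds(ctx: AudioContext): number {
      return ctx.currentTime; // seconds since the AudioContext was created
    }
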
>>>
>>> The point of DOMHighResTimeStamp is that it is divorced from wallclock
>>> time. All the MIDI implementations use this kind of time stamp (even
>>> Windows, read on).
>>>
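
(A small demonstration of "divorced from wallclock time": Date.now()
follows the system clock and can jump when it is adjusted, while
performance.now() - which returns a DOMHighResTimeStamp - is monotonic:)

    // Date.now() tracks the system clock and can jump backwards or forwards
    // if the clock is adjusted; performance.now() only moves forward.
    const wallclockStart = Date.now();        // ms since the UNIX epoch
    const monotonicStart = performance.now(); // ms since the time origin

    setTimeout(() => {
      // If the system clock changes during the wait, the first delta can be
      // wrong or even negative; the second cannot go backwards.
      console.log("wallclock delta:", Date.now() - wallclockStart);
      console.log("monotonic delta:", performance.now() - monotonicStart);
    }, 1_000);
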
>>>
>>> >
>>> > 2) I would absolutely recommend that we (similar to
>>> DOMHighResTimeStamp) explicitly state that implementations are allowed to
>>> have millisecond-only precision.  The underlying system APIs on Windows are
>>> based in milliseconds, for example - unless they're building on another API,
>>> the time stamps on MIM_DATA are in milliseconds. The precision of the
>>> underlying API on OSX is harder to determine, but I think it is higher.
>>> >
>>>
>>> Actually the ONLY part of DirectMusic that is undeprecated (it
>>> disappeared briefly in Vista, then was replaced in a service pack) is
>>> high resolution monotonic MIDI timestamps:
>>>
>>> http://msdn.microsoft.com/en-us/library/ee416788(VS.85).aspx#ID4EFEAC
>>> http://support.microsoft.com/kb/943253
>>>
>>>
>>> So yes, we can specify that the timestamps might only have ms
>>> resolution, but I don't think it's really required.
>>>
>>>
>>> Adam
>>>
>>> >
>>> >
>>> >
>>> > On Fri, Jun 1, 2012 at 8:48 AM, Jussi Kalliokoski <
>>> jussi.kalliokoski@gmail.com> wrote:
>>> >>
>>> >> This issue is now pending review per
>>> https://dvcs.w3.org/hg/audio/rev/b78b7c5e906e .
>>> >>
>>> >>
>>> >> On Fri, Jun 1, 2012 at 6:22 PM, Jussi Kalliokoski <
>>> jussi.kalliokoski@gmail.com> wrote:
>>> >>>
>>> >>> Good catch, thank you! As I planned it, the timestamp should have
>>> been a floating point value, allowing for sub-millisecond precision, but
>>> DOMHighResTimeStamp is actually a better fit for this.
>>> >>> I will make the necessary changes to the spec.
>>> >>>
>>> >>> Cheers,
>>> >>> Jussi
>>> >>>
>>> >>>
>>> >>> On Fri, Jun 1, 2012 at 6:16 PM, Audio Working Group Issue Tracker <
>>> sysbot+tracker@w3.org> wrote:
>>> >>>>
>>> >>>> Audio-ISSUE-105 (MIDI timestamp resolution): timestamps in MIDI
>>> should use High Resolution Time [MIDI API]
>>> >>>>
>>> >>>> http://www.w3.org/2011/audio/track/issues/105
>>> >>>>
>>> >>>> Raised by: Adam Goode
>>> >>>> On product: MIDI API
>>> >>>>
>>> >>>> The current MIDI API specifies timestamp as a long representing
>>> "milliseconds from the UNIX Epoch".
>>> >>>>
>>> >>>> For MIDI applications, millisecond resolution is insufficient and
>>> can cause noticeable jitter.
>>> >>>>
>>> >>>> Using absolute wallclock time is also problematic, as it is subject
>>> to system clock skew.
>>> >>>>
>>> >>>> The MIDI timestamp should use High Resolution Time
>>> (DOMHighResTimeStamp), which solves these problems:
>>> >>>>
>>> http://dvcs.w3.org/hg/webperf/raw-file/tip/specs/HighResolutionTime/Overview.html
>>> >>>>
>>> >>>>
>>> >>>>
>>> >>>
>>> >>
>>> >
>>>
>>>
>>>
>

Received on Friday, 1 June 2012 20:03:26 UTC