[whatwg] Timing API proposal for measuring intervals

From: Mark Callow <callow_mark@hicorp.co.jp>
Date: Fri, 08 Jul 2011 19:32:24 +0900
Message-ID: <4E16DCB8.1080805@hicorp.co.jp>

On 08/07/2011 11:54, James Robinson wrote:
> True.  On OS X, however, the CoreVideo and CoreAudio APIs are specified to
> use a unified time base (see
> http://developer.apple.com/library/ios/#documentation/QuartzCore/Reference/CVTimeRef/Reference/reference.html)
> so if we do end up with APIs saying "play this sound at time X", like Chris
> Rogers's proposed Web Audio API provides, it'll be really handy if we have a
> unified timescale for everyone to refer to.
If you are to have any hope of synchronizing a set of media streams,
you need a common timebase. In TV studios it is called house sync. In
the first computers capable of properly synchronizing media streams,
and in the OpenML specification, it was called UST (Unadjusted System
Time). This is the "monotonic uniformly increasing hardware timestamp"
referred to in the Web Audio API proposal. The more things change, the
more they stay the same. For synchronization purposes, animation is
just another media stream, and it must use the same timebase as audio
and video.
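
To make that concrete, here is a minimal sketch (illustrative only,
not from the original thread) of scheduling sound and animation
against one shared clock. It assumes the Web Audio API roughly as it
later shipped, where AudioContext.currentTime exposes exactly the kind
of monotonic timebase described above:

    // Sketch: one monotonic clock (the audio context's currentTime,
    // in seconds) drives both a scheduled sound and the animation
    // that must stay in step with it.
    const ctx = new AudioContext();

    function playClickAt(when: number): void {
      // "Play this sound at time X" -- 'when' is on the audio clock.
      const osc = ctx.createOscillator();
      osc.connect(ctx.destination);
      osc.start(when);          // sample-accurate start on the shared clock
      osc.stop(when + 0.05);
    }

    const startTime = ctx.currentTime + 0.5; // half a second from now
    playClickAt(startTime);

    // Drive the animation from the same clock, so audio and visuals
    // cannot drift apart the way two unrelated clocks can.
    function frame(): void {
      const t = ctx.currentTime - startTime; // seconds relative to the click
      // ...update animation state as a function of t here...
      if (t < 2) requestAnimationFrame(frame);
    }
    requestAnimationFrame(frame);

If the animation were timed with Date.now() instead, adjustments to or
drift between the two clocks could push the visuals out of step with
the audio; sharing the timebase removes that failure mode.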

Regards

    -Mark
Received on Friday, 8 July 2011 10:32:24 UTC
