Re: Value of currentTime

Indeed.  https://github.com/WebAudio/web-audio-api/issues/12.


On Wed, May 28, 2014 at 5:47 AM, Joseph Berkovitz <joe@noteflight.com> wrote:

> I think Julia’s excellent question highlights a deeper issue with the spec
> and the API. It has already been noted in the WG but perhaps isn’t
> generally well understood, so I think it’s worth repeating here.
>
> There is a set of important use cases in which one is scheduling audio
> relative to some non-audio event reflecting a user’s action, such as a
> mouse click or other input gesture. For instance, in a game this could be
> a sound effect triggered in response to the user’s manipulating an
> object onscreen, perhaps ASAP or perhaps with some known delay.
>
> So while it is important to understand the update frequency of
> currentTime, it’s even more important to be able to understand two
> relationships: 1) between the high-resolution time stamps in DOM events
> and AudioContext.currentTime, and 2) between the audio context time at
> which an audio signal is scheduled and the audio time at which the
> corresponding signal emerges from the physical audio output device.
> Unfortunately, at present the API doesn’t address these essential points.
>
> If this relationship could be understood via the API, the update frequency
> of currentTime might not be relevant. One would simply convert from DOM
> event time to audio time as needed.
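>
> For illustration, here is a minimal sketch of such a conversion, assuming
> the event’s timeStamp shares performance.now()’s timeline, and estimating
> the clock offset by reading both clocks back-to-back (only approximate,
> since nothing in the API guarantees the two reads happen at the same
> instant; context, element and clickBuffer are placeholders):
>
>   // Estimate the offset between the DOM clock (performance.now(), in
>   // milliseconds) and the audio clock (context.currentTime, in seconds)
>   // by sampling both back-to-back.
>   function domTimeToAudioTime(context, domTimeMs) {
>     var offsetSec = context.currentTime - performance.now() / 1000;
>     return domTimeMs / 1000 + offsetSec;
>   }
>
>   element.addEventListener('mousedown', function (e) {
>     // Schedule a sound effect 100 ms after the user's gesture,
>     // expressed on the audio clock rather than the DOM clock.
>     var source = context.createBufferSource();
>     source.buffer = clickBuffer; // some preloaded AudioBuffer
>     source.connect(context.destination);
>     source.start(domTimeToAudioTime(context, e.timeStamp) + 0.1);
>   });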
>
> .            .       .    .  . ...Joe
>
> *Joe Berkovitz*
> President
>
> *Noteflight LLC*
> Boston, Mass.
> phone: +1 978 314 6271
> www.noteflight.com
> "Your music, everywhere"
>
> Here’s a relevant bit of cross-posting from the WG:
>
> On Jan 22, 2013, at 8:24 PM, Chris Wilson <cwilso@google.com> wrote:
>
> To synchronize those events properly, regardless of how an application
> defines its "master clock" (e.g. "schedule sending this MIDI message and
> starting this audio buffer at the same time"), as you said I think we'll
> need some form of "at this audio time, it was/is this system time."
> Perhaps there's some other way to do this, but I'm now hitting a wall in
> how to precisely synchronize events in both worlds.
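>
> (For concreteness, one sketch of the paired reading being asked for is
> below; sampling the two clocks back-to-back only approximates
> simultaneity, which is exactly why an API-level guarantee is wanted.)
>
>   // One (audioTime, systemTime) pair: "at this audio time, it was/is
>   // this system time". Both values in seconds.
>   function snapshotClocks(context) {
>     return {
>       audioTime: context.currentTime,
>       systemTime: performance.now() / 1000
>     };
>   }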
>
>
> On Jan 22, 2013, at 8:58 PM, Srikumar Karaikudi Subramanian
> <srikumarks@gmail.com> wrote:
>
> For such synchronization to be possible, every subsystem needs to provide
> enough information to transform its timestamps to/from a common time
> coordinate system within a local time span. Performance.now() seems to be a
> reasonable candidate for this reference coordinate system. The MIDI clock
> may be able to use it directly, but the audio clock cannot do so (in
> general) and must therefore expose the mapping directly.
>
> A general API for this might look like audioTimeToDOMTime(t),
> DOMTimeToAudioTime(t), etc. Multiple subsystems could then adopt this
> convention to expose their own timing relationships, e.g. some kind of
> network time stamp. So to translate an audio time stamp to a MIDI time
> stamp, one would call DOMTimeToMidiTime(audioTimeToDOMTime(t)). These
> functions may not be "pure" in the sense that a given value of t always
> produces the same result within a session, since they can also account
> for clock drift between the subsystems.
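>
> (A minimal sketch of that convention, using the function names proposed
> above; the drift handling, periodically re-sampling the offset, is just
> one possible policy and not part of the proposal:)
>
>   // Maintains a mapping between the audio clock and the DOM clock.
>   // Because the offset is re-sampled over time, a given value of t need
>   // not always map to the same result, matching the "not pure" caveat.
>   function ClockMapper(context) {
>     var offset = performance.now() / 1000 - context.currentTime;
>     setInterval(function () {
>       offset = performance.now() / 1000 - context.currentTime;
>     }, 1000); // re-sample once a second to follow clock drift
>
>     this.audioTimeToDOMTime = function (t) { return t + offset; };
>     this.DOMTimeToAudioTime = function (t) { return t - offset; };
>   }
>
>   // Usage, with a similar hypothetical mapper for a MIDI clock: an audio
>   // time stamp translates via the DOM clock, e.g.
>   //   var audio = new ClockMapper(context);
>   //   midi.DOMTimeToMidiTime(audio.audioTimeToDOMTime(t));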
>

Received on Wednesday, 28 May 2014 16:14:42 UTC