Re: Exposing a playbackPosition property on AudioBufferSourceNode

The mixer thread is the Web Audio processing thread, yes - it mixes
streams together and sends audio to the sound card. Sorry for not
knowing the terminology; I didn't mean to be confusing.

The problem I'm describing is that all time measurement and arithmetic
happens on the main JS thread, so JS can stall - due to something like
a GC - after it records currentTime, or after currentTime is
synchronized with the processing thread. At that point any attempt to
synchronize two playing tracks, or to record the current playback
position in order to pause, fails because the two clocks are out of
sync. Resuming the paused track replays a snippet of audio the user
already heard, and the synchronized tracks end up out of sync because
of that clock drift (probably doubly so, since the processing thread
may get further ahead by the time it receives the command to start the
new synchronized track).
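
To make the race concrete, here's a rough sketch of the pause/resume
pattern I mean (names like 'ctx', 'source' and 'buffer' are just
placeholders, not anything from the spec):

    var startedAt = ctx.currentTime;
    var pausedOffset = 0;
    source.start(startedAt);

    function pause() {
      // If JS stalls (e.g. a GC) between this read of currentTime
      // and the moment stop() takes effect, the real stop position
      // is later than pausedOffset claims.
      pausedOffset = ctx.currentTime - startedAt;
      source.stop();
    }

    function resume() {
      // Source nodes are one-shot, so resuming means a new node.
      source = ctx.createBufferSource();
      source.buffer = buffer;
      source.connect(ctx.destination);
      // Starting from the stale offset replays a snippet the user
      // already heard.
      source.start(0, pausedOffset);
    }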

It's unfortunate to hear that start() starts 'when it can' at its
initial start position, because that thoroughly breaks any attempt to
synchronize against a currently playing track, unless you schedule the
start far enough in the future that the processing thread can't beat
you to that time. The 'start when you can' behavior is great when no
'when' is specified for starting playback. But I guess this means that
any scenario involving precise timing needs its commands issued to the
AudioContext something like 250ms in advance, to ensure the processing
thread never gets ahead of you?
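
If so, I imagine the workaround looks something like this (the 250ms
margin is just my guess at a safe value, not something the spec
guarantees):

    var SAFETY_MARGIN = 0.25; // seconds; a guess, not a spec value

    // Start newSource so it lines up with a track that began
    // playing at playingStartedAt (in AudioContext time).
    function startAligned(ctx, newSource, playingStartedAt) {
      var when = ctx.currentTime + SAFETY_MARGIN;
      // Offset into the new track so that, at time 'when', it is
      // at the same position as the already playing track.
      newSource.start(when, when - playingStartedAt);
    }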

Both cases I describe are fairly simple race conditions, though, so I
think they could be fixed - I just don't know exactly how you would do
that with the API as it stands. Because we can't stall the processing
thread until the JS thread is ready, it's inevitable that
AudioContext.currentTime will fall a bit behind the processing thread,
even if only rarely. Under heavier GC pressure, it seems likely that
users will hit gaps big enough to be noticeable.
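
For what it's worth, the staleness is easy to observe: record the
audio clock, yield to the event loop, and see how far the clock moved
before your follow-up code ran ('ctx' is again a placeholder):

    var before = ctx.currentTime;
    setTimeout(function () {
      // With a 0ms timeout this gap should be a few milliseconds;
      // under GC pressure it can be far larger, and any offset
      // computed from 'before' is off by exactly that much.
      var gap = ctx.currentTime - before;
      console.log('audio clock advanced ' + gap.toFixed(3) + 's');
    }, 0);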

On Wed, Feb 26, 2014 at 11:09 AM, Chris Wilson <cwilso@google.com> wrote:
> I'm confused by what you mean by "mixer thread".  Do you mean the Web
> Audio processing thread?  The processing thread is where accurate
> changes need to happen, of course - but your "pause/resume" API would
> be called from your code, in the main JS thread, which schedules most
> changes so they CAN happen at precise times.  The Web Audio processing
> thread *is* the master clock.
>
> Yes, it's possible to precisely synchronize two tracks in time; the
> scheduler in Web Audio is extremely precise.  start() will not, by
> default, "make up" time if it is scheduled too late by skipping ahead;
> it will just start when it can ("if the value is less than currentTime,
> then the sound will start playing immediately." - we should probably
> edit that to say "at its initial start position".)
>
>
> On Tue, Feb 25, 2014 at 10:04 AM, K. Gadd <kg@luminance.org> wrote:
>>
>> Note that this is a scenario where the explicit pause/resume APIs
>> that have been requested before would enable a more accurate
>> implementation of pausing/resuming playback, because the mixer thread
>> would be able to pause playback accurately at whatever the current
>> time is by the time it receives the command from JS. As-is, any
>> attempt to pause/resume will naturally introduce some amount of drift
>> in cases where the mixer thread gets ahead of JS (after JS has
>> recorded the current time) or where JS is ahead of the mixer thread
>> (because we've optimistically set currentTime somewhere in the future
>> to try to keep ahead of the mixer thread). In either case there will
>> be discontinuities in playback (though hopefully small enough that
>> the user won't notice them).
>>
>> It is at least possible to do explicit synchronization since you can
>> pass an exact time when starting playback, so that's good. However, if
>> I have one track already playing and I want to align a second track
>> with it, is that possible given the current API and mixer guarantees?
>> Isn't it possible for the mixer to have gotten ahead of currentTime,
>> meaning that the new track will end up misaligned? Or will the mixer
>> instead try to satisfy the start time I provided and skip the first
>> couple dozen milliseconds of the new track?
>>
>> On Mon, Feb 24, 2014 at 7:39 PM, Chinmay Pendharkar
>> <chinmay.pendharkar@gmail.com> wrote:
>> > Thanks Chris.
>> >
>> > I will try counting time. I'm guessing I'll have to somehow hook
>> > all the AudioParam methods and pre-calculate where the index will
>> > be at the end of each processing event.
>> >
>> > OK. Agreed that sample-accurate indices are not really going to be
>> > useful. I understand the conflict between knowing the 'true'
>> > position of playback and using it to do any kind of processing
>> > (especially on the JavaScript side).
>> >
>> > But it would be useful to know where playback stopped after a
>> > "stop" call. And a rough estimate of the index could be useful for
>> > some kinds of synchronisation between two tracks (starting the
>> > second one from where the first left off, etc.). Since that index
>> > is already tracked internally, it makes sense to expose it instead
>> > of tracking and calculating it again outside in JavaScript.
>> >
>> > On a related note, would such a read-only parameter be useful on
>> > the OscillatorNode as well? Especially when used with a
>> > user-provided PeriodicWave. I haven't used PeriodicWaves much, so I
>> > can't think of a use case immediately, but an AudioBufferSourceNode
>> > with .loop set to true and a PeriodicWave do seem very similar in
>> > how they work, so it might be applicable to the OscillatorNode as
>> > well.
>> >
>> > I will file an issue on the spec later today.
>> >
>> > Thanks again.
>> >
>> > -Chinmay
>> >
>> >
>> >
>> > On Tue, Feb 25, 2014 at 2:17 AM, Chris Wilson <cwilso@google.com> wrote:
>> >>
>> >> Hey Chinmay,
>> >>
>> >> I've specifically asked for this feature before myself - notably,
>> >> playing around with DJ software on Web Audio, I found it would have
>> >> been really useful - because, exactly as you said, it gets
>> >> challenging to keep track of the current time when playbackRate
>> >> keeps changing (I *do* keep track in that app, even through linear
>> >> ramps of playback rate, but the math gets more challenging when you
>> >> use log ramps).
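>> >>
>> >> For a linear ramp the bookkeeping is just the area under the ramp -
>> >> a sketch of the idea (not the actual code from that app):
>> >>
>> >>   // Buffer position consumed while playbackRate ramps linearly
>> >>   // from r0 (at time t0) to r1 (at time t1): the integral of
>> >>   // the rate over the segment, i.e. the trapezoid area.
>> >>   function positionAfterLinearRamp(r0, r1, t0, t1) {
>> >>     return (r0 + r1) / 2 * (t1 - t0);
>> >>   }
>> >>
>> >> Summing that per ramp segment tracks the position; the log-ramp
>> >> integral is where it gets messier.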
>> >>
>> >> However, I would point out that you're not likely to get
>> >> "sample-accurate sample indices" - simply because while you're off
>> >> twiddling with something, the current time may change.  Also, you
>> >> can't really hack at looping like this, for the same reason we
>> >> can't really have an efficient "loopEnded" event: it may happen at
>> >> far too high a frequency for us to fire DOM events and expect
>> >> results.  In keeping with the rest of the API, of course, this
>> >> should be a time, not a sample index.
>> >>
>> >> We could expose something useful along the lines you suggested - to
>> >> know where the playback was when you called "stop", or even to
>> >> schedule ahead - but I'd caution against treating it as a panacea.
>> >> I'd also suggest that this should NOT be a writable parameter, and
>> >> certainly should not be an AudioParam - that would conflict with
>> >> playbackRate.
>> >>
>> >> Would you like to file an issue on the spec?
>> >> https://github.com/WebAudio/web-audio-api/issues
>> >>
>> >> -Chris
>> >>
>> >>
>> >> On Fri, Feb 21, 2014 at 12:51 AM, Chinmay Pendharkar
>> >> <chinmay.pendharkar@gmail.com> wrote:
>> >>>
>> >>> Hello,
>> >>>
>> >>> Let me first thank everyone in this group for all the work
>> >>> they've put into the WebAudioAPI. I must say I'm enjoying using
>> >>> the WebAudioAPI both as a developer and as a consumer.
>> >>>
>> >>> I'm trying to implement an interactive effect/synthesis library
>> >>> using the WebAudioAPI, and I found myself looking for a
>> >>> `playbackPosition` property on the AudioBufferSourceNode which
>> >>> exposes the sample index currently being played.
>> >>>
>> >>> A simple use case would be to implement pause-like functionality
>> >>> on AudioBufferSourceNode, which would allow playback to resume at
>> >>> the sample-accurate position where it had been stopped (the
>> >>> `offset` argument of the start() method would be handy here),
>> >>> albeit using another instance of AudioBufferSourceNode. There are
>> >>> many other situations where AudioBufferSourceNode's
>> >>> playbackPosition would be useful, especially when used with
>> >>> looping.
>> >>>
>> >>> Using currentTime to keep time and calculate the position doesn't
>> >>> work if the playbackRate of the AudioBufferSourceNode is being
>> >>> changed.
>> >>>
>> >>> I did find an old email on this list from 2011 talking about
>> >>> this, but there didn't seem to be any other discussion of such a
>> >>> property:
>> >>> http://lists.w3.org/Archives/Public/public-audio/2011OctDec/0143.html
>> >>>
>> >>> I haven't found any other method of getting a sample-accurate
>> >>> playbackPosition. Am I missing something? Or is this something
>> >>> that can be added to the WD?
>> >>>
>> >>> -Chinmay Pendharkar
>> >>
>> >>
>> >
>
>

Received on Wednesday, 26 February 2014 22:47:19 UTC