- From: Chris Wilson <cwilso@google.com>
- Date: Wed, 26 Feb 2014 11:09:06 -0800
- To: Katelyn Gadd <kg@luminance.org>
- Cc: Chinmay Pendharkar <chinmay.pendharkar@gmail.com>, "public-audio@w3.org" <public-audio@w3.org>
- Message-ID: <CAJK2wqUf_GUtWJRKiFj9=L2Yma01n9FvKxGqtoRva3HCCW1aQw@mail.gmail.com>
I'm confused by what you mean by "mixer thread". Do you mean the Web Audio
processing thread? The processing thread is where accurate changes need to
happen, of course - but your "pause/resume" API would be called from your
code, in the main JS thread, which schedules most changes so they CAN
happen at precise times. The Web Audio processing thread *is* the master
clock.

Yes, it's possible to precisely synchronize two tracks in time; the
scheduler in Web Audio is extremely precise. start() will not, by default,
"make up" for a late schedule by skipping time - it will just start when it
can ("if the value is less than currentTime, then the sound will start
playing immediately." - should probably edit that to say "at its initial
start position").

On Tue, Feb 25, 2014 at 10:04 AM, K. Gadd <kg@luminance.org> wrote:
> Note that this is a scenario where the explicit pause/resume APIs that
> have been requested before would enable a more accurate implementation
> of pausing/resuming playback, because the mixer thread would be able to
> accurately pause playback at whatever the current time is by the time
> it receives the command from JS. As-is, any attempt to pause/resume
> will naturally introduce some amount of drift in cases where the mixer
> thread gets ahead of JS (after JS has recorded the current time) or
> where the JS is ahead of the mixer thread (because we've optimistically
> set currentTime somewhere in the future to try and keep ahead of the
> mixer thread). In either case there will be discontinuities in playback
> (though hopefully they will be small enough that the user won't notice
> them).
>
> It is at least possible to do explicit synchronization since you can
> pass an exact time when starting playback, so that's good. However, if
> I have one track already playing and I want to align a second track
> with it, is that possible given the current API and mixer guarantees?
> Isn't it possible for the mixer to have gotten ahead of currentTime,
> meaning that the new track will end up misaligned? Or will the mixer
> instead try to satisfy the start time I provided and skip the first
> couple dozen milliseconds of the new track?
>
> On Mon, Feb 24, 2014 at 7:39 PM, Chinmay Pendharkar
> <chinmay.pendharkar@gmail.com> wrote:
> > Thanks Chris.
> >
> > I will try counting time. I'm guessing I'll have to somehow hook all
> > AudioParam methods and pre-calculate where the index will be at the
> > end of each Processing Event.
> >
> > OK. Agreed that sample-accurate indices are not really going to be
> > useful. I understand the conflict between knowing the 'true' position
> > of playback and using it to do any kind of processing (especially on
> > the JavaScript side).
> >
> > But it would be useful to know where the playback stopped after a
> > "stop" call. And a rough estimate of the index could be useful for
> > some types of synchronisation between two tracks (start the second
> > one from where the first left off, etc.). Since that index is already
> > tracked internally, it makes sense to expose it instead of tracking
> > and calculating it again outside in JavaScript.
> >
> > On a related note, would such a read-only parameter also be useful on
> > the OscillatorNode, especially when used with a user-provided
> > PeriodicWave? I haven't used PeriodicWaves much so I can't think of a
> > use case immediately, but an AudioBufferSourceNode with .loop set to
> > true and a PeriodicWave do seem very similar in how they work, so it
> > might be applicable to the OscillatorNode as well.
> >
> > I will file an issue on the spec later today.
> >
> > Thanks again.
> >
> > -Chinmay
> >
> > On Tue, Feb 25, 2014 at 2:17 AM, Chris Wilson <cwilso@google.com> wrote:
> >> Hey Chinmay,
> >>
> >> Although I've specifically asked for this feature before - notably,
> >> in playing around with DJ software on Web Audio, I found this would
> >> have been really useful - because exactly as you said, it gets
> >> challenging to keep track of current time when playbackRate keeps
> >> changing (I *do* keep track in that app, even through linear ramps
> >> of playback rate, but the math starts getting more challenging when
> >> you use log ramps).
> >>
> >> However, I would point out that you're not likely to have
> >> "sample-accurate sample indices" - just because while you're off
> >> twiddling with something, the current time may change. Also, you
> >> can't really hack at looping like this, for the same reason we can't
> >> really have an efficient "loopEnded" event - because it may happen
> >> at far too high a frequency for us to fire DOM events and expect
> >> results. In keeping with the rest of the API, of course, this should
> >> be a time, not a sample index.
> >>
> >> We could expose something that would be useful, as you suggested -
> >> to know where the playback was when you called "stop", or even to
> >> schedule ahead - but I'd caution against it being a panacea. I'd
> >> also suggest that this should NOT be a writeable parameter, and
> >> certainly should not be an AudioParam - it would conflict with
> >> playbackRate then.
> >>
> >> Would you like to file an issue on the spec?
> >> https://github.com/WebAudio/web-audio-api/issues
> >>
> >> -Chris
> >>
> >> On Fri, Feb 21, 2014 at 12:51 AM, Chinmay Pendharkar
> >> <chinmay.pendharkar@gmail.com> wrote:
> >>> Hello,
> >>>
> >>> Let me first thank everyone in this group for all the work they've
> >>> put into the Web Audio API. I must say I'm enjoying using the Web
> >>> Audio API both as a developer and a consumer.
> >>>
> >>> I'm trying to implement an interactive effect/synthesis library
> >>> using the Web Audio API, and I found myself looking for a
> >>> `playbackPosition` property on the AudioBufferSourceNode which
> >>> exposes the sample index that is currently being played.
> >>>
> >>> A simple use case would be to implement pause-like functionality on
> >>> AudioBufferSourceNode, which would allow playback to resume at the
> >>> sample-accurate position where it had been stopped (the `offset`
> >>> parameter of the start() method would be handy here), albeit using
> >>> another instance of AudioBufferSourceNode. There are many other
> >>> situations where AudioBufferSourceNode's playbackPosition would be
> >>> useful, especially when used with looping.
> >>>
> >>> Using currentTime to calculate the position doesn't work if the
> >>> playbackRate of the AudioBufferSourceNode is being changed.
> >>>
> >>> I did find an old email on this list from 2011 talking about this,
> >>> but there didn't seem to be any other discussion about such a
> >>> property.
> >>> http://lists.w3.org/Archives/Public/public-audio/2011OctDec/0143.html
> >>>
> >>> I haven't found any other method to get a sample-accurate
> >>> playbackPosition. Am I missing something? Or is this something that
> >>> can be added to the WD?
> >>>
> >>> -Chinmay Pendharkar
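The pause/resume bookkeeping this thread keeps returning to can be sketched in plain JavaScript. Everything below is an illustration, not part of the Web Audio API: `PositionTracker` is a hypothetical helper, and times are passed in explicitly (a page would read them from `AudioContext.currentTime`), so it only models a constant `playbackRate`.

```javascript
// Hypothetical main-thread helper for tracking playback position, assuming
// a constant playbackRate. Not part of the Web Audio API.
class PositionTracker {
  constructor(playbackRate = 1) {
    this.rate = playbackRate;
    this.offset = 0;       // position within the buffer, in seconds
    this.startedAt = null; // context time of the last start(), or null if paused
  }
  start(now) {
    this.startedAt = now;
  }
  pause(now) {
    // Advance the stored offset by the elapsed wall time, scaled by rate.
    this.offset += (now - this.startedAt) * this.rate;
    this.startedAt = null;
    return this.offset; // pass this as the `offset` argument to the next start()
  }
  position(now) {
    return this.startedAt === null
      ? this.offset
      : this.offset + (now - this.startedAt) * this.rate;
  }
}
```

The value returned by `pause()` is what you would pass as the `offset` argument of `start()` on a fresh AudioBufferSourceNode to resume, which is the pattern Chinmay describes. It is also where the drift K. Gadd mentions creeps in: `now` is read on the JS thread, not the processing thread.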
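Chris's point about ramped `playbackRate` can be made concrete: the distance playback advances during a ramp is the integral of the rate curve over the ramp's duration. A minimal sketch, assuming the ramp shapes used by AudioParam's `linearRampToValueAtTime` and `exponentialRampToValueAtTime` (the helper names here are made up for illustration):

```javascript
// How far playback advances (in buffer seconds) while playbackRate ramps
// from r0 to r1 over dt seconds. Hypothetical helpers, not Web Audio API.

// Linear ramp: rate is linear in time, so the integral is the trapezoid area.
function linearRampAdvance(r0, r1, dt) {
  return ((r0 + r1) / 2) * dt;
}

// Exponential ramp: rate follows r0 * (r1/r0)^(t/dt), whose integral over
// [0, dt] is (r1 - r0) * dt / ln(r1/r0).
function exponentialRampAdvance(r0, r1, dt) {
  if (r0 === r1) return r0 * dt; // degenerate case: constant rate
  return ((r1 - r0) * dt) / Math.log(r1 / r0);
}
```

This is the kind of per-segment math a JS-side tracker would have to apply for every scheduled AudioParam change, which is exactly why keeping track "through linear ramps" is workable but gets harder as the ramp shapes get more exotic.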
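For K. Gadd's "align a second track" question, one common approach is to schedule the second source slightly in the future and offset into its buffer by the time that has already elapsed in the first track. A sketch under that assumption (`alignedStart` is a hypothetical helper; in a page the `currentTime` argument would come from `ctx.currentTime`):

```javascript
// Compute when to start a second source, and how far into its buffer to
// begin, so it lines up with a track already playing. Hypothetical helper.
function alignedStart(firstStartTime, currentTime, leadTime) {
  // Schedule far enough ahead that the processing thread cannot get there
  // first; leadTime is the safety margin in seconds.
  const when = currentTime + leadTime;
  // Offset into the second buffer so it matches the first track's position.
  const offset = when - firstStartTime;
  return { when, offset };
}

// In a browser this would be used roughly like:
//   const { when, offset } = alignedStart(firstStart, ctx.currentTime, 0.1);
//   secondSource.start(when, offset);
```

Because the start time is in the future, the mixer never has to "catch up", so neither of the failure modes in the question (misalignment, or skipping the first few dozen milliseconds) should occur, at the cost of a small deliberate delay.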
Received on Wednesday, 26 February 2014 19:09:34 UTC