- From: Srikumar Karaikudi Subramanian <srikumarks@gmail.com>
- Date: Sat, 14 Sep 2013 14:19:36 +0530
- To: "robert@ocallahan.org O'Callahan" <robert@ocallahan.org>
- Cc: Jer Noble <jer.noble@apple.com>, Jussi Kalliokoski <jussi.kalliokoski@gmail.com>, Chris Wilson <cwilso@google.com>, Raymond Toy <rtoy@google.com>, "public-audio@w3.org" <public-audio@w3.org>
- Message-Id: <0A8F4BFA-FB16-4C80-8278-057E2EBD37CA@gmail.com>
I have some more questions on the original "what happens when you disconnect and reconnect source nodes?" problem.

Noting that currentTime can advance between the disconnect() and connect() calls, it seems to me that the intended audio behaviour cannot be reliably reproduced, since these calls aren't time stamped. Such a reconnection seems reasonable to do in a realtime context, but what would it mean when done with an OfflineAudioContext, where any amount of time may pass between the two calls? My only guess is that a dev doing this is trying to reuse existing nodes as a hand optimization, not because the behaviour can only be obtained this way.

It seems to me that the right way to convey the original intention is to stop() the source node, drop the reference to it, create a new source node on the same buffer, connect it to the second destination and start() it at the right time, with the right buffer playback offset. With this, you can get a sample-accurate and reliable switch from one input to another, which you cannot get using only connect()/disconnect() calls, no matter how they are implemented.
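A rough sketch of such a switch-over might look like the following; ctx, buffer, destA and destB are just illustrative placeholders for an AudioContext, an AudioBuffer and two destination subgraphs, and the timing bookkeeping is only a sketch:

    // The currently playing source, feeding destA.
    var oldSource = ctx.createBufferSource();
    oldSource.buffer = buffer;
    oldSource.connect(destA);
    var startedAt = ctx.currentTime;
    oldSource.start(startedAt);

    // Later: switch to destB, sample-accurately, at a scheduled time.
    function switchToB(switchTime) {
        var offset = switchTime - startedAt;  // how far into the buffer we will be
        oldSource.stop(switchTime);           // schedule the old source to stop
        oldSource = null;                     // drop the reference so it can be GCd

        var newSource = ctx.createBufferSource();
        newSource.buffer = buffer;            // reuse the same buffer
        newSource.connect(destB);
        newSource.start(switchTime, offset);  // resume at the matching playback offset
    }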
If the current pause/play mechanism discussion is about a solution to facilitate correct specification of the behaviour in the disconnect-reconnect case, then I'm not sure what exactly it would solve. Also, I've needed so many different notions of "pause" in musical applications -- Do noteoffs stop abruptly? Do I let reverb tails run? Do I wait for the current bar to finish? -- that I'm not sure any one technical solution would be broadly applicable.

The issue https://www.w3.org/Bugs/Public/show_bug.cgi?id=17422 was closed, with the editor saying "There are sufficient controls where this isn't needed but we can discuss that. Any scheduled events with AudioBufferSourceNodes can be cancelled. Any type of HTMLMediaElement can be paused. Volume can be muted on streams. I feel there is sufficient control already." ... and no further objections were raised. Are there any new use cases that have arisen to bring back discussion of graph pause/play?

-Kumar

On 14 Sep, 2013, at 5:52 AM, "Robert O'Callahan" <robert@ocallahan.org> wrote:

> On Fri, Sep 13, 2013 at 4:23 PM, Jer Noble <jer.noble@apple.com> wrote:
> On Sep 13, 2013, at 4:16 PM, Robert O'Callahan <robert@ocallahan.org> wrote:
>> On Fri, Sep 13, 2013 at 1:50 PM, Jer Noble <jer.noble@apple.com> wrote:
>> Of course, HTMLMediaElements must pause when they are removed from the DOM. So in the general case, disconnecting a playing media element and throwing away all references to it will allow it to be GCd.
>>
>> But you can call play() on a media element after removing it from the DOM.
>>
>> If we were to follow the media element model in Web Audio, we would introduce per-node pause() and resume() APIs and specify that when you disconnect the last output of a node, there is an implicit pause() call.
>
> I’d actually be fine with that.
>
> I might be too :-).
>
> One further hypothetical: should “pause” and “play” traverse down the graph? I.e., if you “pause” a ConvolutionNode, would the AudioBufferSourceNode feeding it pause as well?
>
> We need to support transitive pausing in some way, but I'm not totally sure how it should work.
>
> It might make sense to have a pauseSubgraph() method that behaves like this on a node N:
> -- If N is paused, do nothing and stop. Otherwise:
> -- Pause N.
> -- Collect a set of nodes S defined by the following:
>    -- Add N to S.
>    -- If S contains E, and a node M's output is connected to E, and M is not paused, then add M to S.
>    (S contains the nodes which feed into N along a path of non-paused nodes.)
> -- For each element E of S, if every output of E is connected to a node in S, pause E.
>    (This pauses all the nodes which don't contribute to outputs other than through S.)
> This means:
> * A subgraph of nodes which only produce output via N, and where none of them are paused, all get paused.
> * A node that produces output via N and also along a path not including N does not get paused.
> * If a paused node is connected to N, then pauseSubgraph on N will not affect the inputs to the already-paused node.
>
> We would then make disconnect() of the last output of a node implicitly call pauseSubgraph(). We would probably want to add a disconnectWithoutPausing variant.
>
> For resumeSubgraph(), I would propose this:
> -- If N is not paused, do nothing and stop. Otherwise:
> -- Collect a set of nodes S defined by the following:
>    -- Add N to S.
>    -- If S contains E, and a node M's output is connected to E, and M is paused, then add M to S.
>    (S contains the nodes which feed into N along a path of paused nodes.)
> -- For each element E of S, resume E.
> So basically every node that is connected to N along a path of paused nodes is resumed.
>
> ===============
>
> An alternative approach would be to define a "paused" attribute on AudioNodes with three possible values: "paused", "running", "auto". Then for each node we compute the "effective paused state", which is true if the node is "paused", false if the node is "running", and for "auto" nodes is true if all output nodes' effective paused states are true. Take the "greatest fixed point" solution, so that nodes in a cycle of "auto" nodes with no other outputs are paused.
>
> If we start every node in the "auto" state, that would match what Blink currently does (AIUI). But we could say that a node starts in the "running" state and switches to the "auto" state when its output is first connected to another node. That way, standalone nodes would work without any shenanigans.
>
> Rob
> --
> Jtehsauts tshaei dS,o n" Wohfy Mdaon yhoaus eanuttehrotraiitny eovni le atrhtohu gthot sf oirng iyvoeu rs ihnesa.r"t sS?o Whhei csha iids teoa stiheer :p atroa lsyazye,d 'mYaonu,r "sGients uapr,e tfaokreg iyvoeunr, 'm aotr atnod sgaoy ,h o'mGee.t" uTph eann dt hwea lmka'n? gBoutt uIp waanndt wyeonut thoo mken.o w
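For concreteness, a rough sketch of the pauseSubgraph()/resumeSubgraph() traversal proposed above. Everything here is hypothetical -- paused, inputs, outputs, pause(), resume() and the two functions are not part of the current Web Audio API; the code only restates the quoted algorithm:

    // Pause N, then pause every node that feeds into N along a path of
    // non-paused nodes and whose outputs all stay inside that set.
    function pauseSubgraph(N) {
        if (N.paused) return;   // already paused: do nothing
        N.pause();              // pause N itself

        // Collect S: nodes reaching N along paths of non-paused nodes.
        var S = new Set([N]);
        var worklist = [N];
        while (worklist.length > 0) {
            var E = worklist.pop();
            for (var M of E.inputs) {
                if (!M.paused && !S.has(M)) { S.add(M); worklist.push(M); }
            }
        }

        // Pause each member of S all of whose outputs go to nodes in S,
        // i.e. nodes that contribute nothing except through N.
        for (var E of S) {
            if (E.outputs.every(function (out) { return S.has(out); })) E.pause();
        }
    }

    // Resume N and every node that feeds into N along a path of paused nodes.
    function resumeSubgraph(N) {
        if (!N.paused) return;
        var S = new Set([N]);
        var worklist = [N];
        while (worklist.length > 0) {
            var E = worklist.pop();
            for (var M of E.inputs) {
                if (M.paused && !S.has(M)) { S.add(M); worklist.push(M); }
            }
        }
        for (var E of S) E.resume();
    }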
Received on Saturday, 14 September 2013 08:50:30 UTC