- From: Chris Wilson <notifications@github.com>
- Date: Wed, 02 Oct 2013 08:59:24 -0700
- To: WebAudio/web-audio-api <web-audio-api@noreply.github.com>
- Message-ID: <WebAudio/web-audio-api/issues/8/25551065@github.com>
Relevant discussion thread: http://lists.w3.org/Archives/Public/public-audio/2013JulSep/1812.html

There are two schools of thought here:

1) Audio node/param connections are similar to 1/4" audio patch cables: they do not affect "time" for the nodes on either side of the connection; time proceeds in a linear fashion. So, for example, if a BufferSource node were disconnected in the middle of playing and then reconnected two seconds later, playback would resume two seconds further along, as if the node had continued playing while disconnected. Subgraphs would behave similarly; for example, a BufferSource connected to a convolution node would continue playing while the convolution node was disconnected (and you might hear a reverb tail even if you reconnected after the BufferSource had ended).

2) Disconnected nodes (and subgraphs) have nothing pulling time along, so the subgraph is effectively paused while disconnected. This is the current implementation in Blink and WebKit.

Problems with #1: enabling garbage collection of disconnected subgraphs may be trickier, and there is no way to "pause" a subnode or subgraph.

Problems with #2: timelines are no longer shared; you cannot presume that when you call start(now + 2) the playback will actually be locked to that timeframe, since disconnecting the node may disrupt the flow of time in that subgraph (see the sketch below). Pausing live input nodes (and media element source nodes?) is not really possible, so those nodes would be exceptions.
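For concreteness, a minimal sketch of the #2 timing hazard (modern syntax; the node names and the test tone are illustrative, not from the thread):

```js
// Sketch of the "paused while disconnected" (#2) hazard.
const ctx = new AudioContext();

// Build a one-second test tone so the example is self-contained.
const buf = ctx.createBuffer(1, ctx.sampleRate, ctx.sampleRate);
const data = buf.getChannelData(0);
for (let i = 0; i < data.length; i++) {
  data[i] = 0.1 * Math.sin(2 * Math.PI * 440 * i / ctx.sampleRate);
}

const source = ctx.createBufferSource();
source.buffer = buf;
const gain = ctx.createGain();
source.connect(gain);
gain.connect(ctx.destination);

const now = ctx.currentTime;
source.start(now + 2); // intent: playback locked to now + 2

// Unplug the subgraph for one second during the scheduled gap.
gain.disconnect();
setTimeout(() => gain.connect(ctx.destination), 1000);

// Model #1 (patch cables): time kept flowing while unplugged, so the
// tone still begins at now + 2 on the shared timeline.
// Model #2 (current Blink/WebKit): the disconnected subgraph's clock was
// effectively paused for ~1 s, so the start is no longer locked to now + 2.
```

---
Reply to this email directly or view it on GitHub:
https://github.com/WebAudio/web-audio-api/issues/8#issuecomment-25551065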
Received on Wednesday, 2 October 2013 15:59:52 UTC