- From: Chris Rogers <crogers@google.com>
- Date: Wed, 9 Jan 2013 10:32:05 -0800
- To: robert@ocallahan.org
- Cc: public-audio@w3.org
- Message-ID: <CA+EzO0=PqBdpGU=j6hJ3A5e2suDfqg=5qyo=5eyUtcRoOg=v8g@mail.gmail.com>
On Wed, Jan 9, 2013 at 6:09 AM, Robert O'Callahan <robert@ocallahan.org> wrote:

> Nothing says this explicitly in the spec, but it seems to me that this is
> possible as written. For example, if you have a mono input connected to a
> GainNode and later add an additional stereo input, the GainNode will start
> producing stereo. Is this intended?

Yes, this is the case for nodes such as GainNode. DelayNode and
WaveShaperNode are similar in this way. It all happens transparently,
without the JS developer needing to worry about the details. I'll try to be
more explicit in the spec to point out exactly where this happens.

Here are some other cases where the number of channels can change, due to
other causes:

* MediaElementAudioSourceNode: if the .src attribute is set to media with a
  different number of channels.
* AudioBufferSourceNode: if the .buffer is changed to one with a different
  number of channels.
* ChannelMergerNode: if the channel counts of the inputs, or the number of
  connected inputs, change...

As an implementation detail, any necessary re-configuration should probably
happen in between rendering quanta, i.e. at the boundaries of the 128
sample-frame blocks, and not while the graph is right in the middle of
processing a block.

Chris

> Rob
> --
> Jesus called them together and said, “You know that the rulers of the
> Gentiles lord it over them, and their high officials exercise authority
> over them. Not so with you. Instead, whoever wants to become great among
> you must be your servant, and whoever wants to be first must be your
> slave — just as the Son of Man did not come to be served, but to serve,
> and to give his life as a ransom for many.” [Matthew 20:25-28]
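A minimal sketch of the GainNode case discussed above, assuming a standard AudioContext; the silent AudioBuffers and variable names are illustrative placeholders, not taken from the spec:

    // Mono source -> GainNode -> destination.
    const ctx = new AudioContext();

    const monoBuffer = ctx.createBuffer(1, 128, ctx.sampleRate);
    const monoSource = ctx.createBufferSource();
    monoSource.buffer = monoBuffer;

    const gain = ctx.createGain();
    monoSource.connect(gain);
    gain.connect(ctx.destination);
    // At this point the GainNode receives, and therefore outputs, mono.

    // Later, an additional stereo source is connected to the same input.
    const stereoBuffer = ctx.createBuffer(2, 128, ctx.sampleRate);
    const stereoSource = ctx.createBufferSource();
    stereoSource.buffer = stereoBuffer;
    stereoSource.connect(gain);
    // The input mix is now stereo (the mono signal is up-mixed and summed
    // with the stereo one), so the GainNode starts producing stereo output.
    // Per the note above, any such re-configuration would take effect
    // between 128 sample-frame rendering quanta, not mid-block.

    monoSource.start();
    stereoSource.start();

The same channel-count propagation applies to DelayNode and WaveShaperNode, and the AudioBufferSourceNode case is visible here as well: assigning a buffer with a different number of channels changes what the node feeds downstream.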
Received on Wednesday, 9 January 2013 18:32:32 UTC