On Sun, Oct 16, 2011 at 6:15 PM, Alistair MacDonald <al@signedon.com> wrote:
>
>> It should integrate seamlessly with other MediaStream producers and
>> consumers, without bridging.
>>
>
> Could you add some detail to this explaining with/without bridging and why
> it is important?
>
For example, if you look at example 5 here:
https://dvcs.w3.org/hg/audio/raw-file/tip/webaudio/webrtc-integration.html
// "context" is an AudioContext created earlier in the example.
navigator.getUserMedia('audio', gotAudio);
function gotAudio(stream) {
  // Bridge the microphone MediaStream into the Web Audio graph.
  var microphone = context.createMediaStreamSource(stream);
  var backgroundMusic =
      context.createMediaElementSource(document.getElementById("back"));
  var analyser = context.createAnalyser();
  // Bridge the processed audio back out to a MediaStream.
  var mixedOutput = context.createMediaStreamDestination();
  microphone.connect(analyser);
  analyser.connect(mixedOutput);
  backgroundMusic.connect(mixedOutput);
The calls to "createMediaStreamSource" and "createMediaStreamDestination"
map MediaStream objects to AudioNode objects and vice versa. They are only
needed because AudioNodes and MediaStreams are separate worlds that need to
be explicitly bridged. That is an unnecessary complication for authors,
compared to just supporting audio processing directly on MediaStreams.
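To make the contrast concrete, here is a rough sketch of what the same example
might look like if processing were supported directly on MediaStreams. The
names captureStreamFrom, analyseStream and mixStreams are purely illustrative,
not taken from any existing proposal:

navigator.getUserMedia('audio', gotAudio);
function gotAudio(microphone) {
  // Hypothetical API: assume the <audio> element can expose its output as a
  // MediaStream, and that analysis/mixing operate on MediaStreams directly.
  var backgroundMusic = captureStreamFrom(document.getElementById("back"));
  var analysed = analyseStream(microphone);
  var mixedOutput = mixStreams([analysed, backgroundMusic]);
  // No createMediaStreamSource/createMediaStreamDestination calls needed.
}

The particular names don't matter; the point is that authors would never have
to convert between two separate object graphs.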
Rob
--
"If we claim to be without sin, we deceive ourselves and the truth is not in
us. If we confess our sins, he is faithful and just and will forgive us our
sins and purify us from all unrighteousness. If we claim we have not sinned,
we make him out to be a liar and his word is not in us." [1 John 1:8-10]