Re: AudioDestinationNode > 2 channels?

I see... Sorry, I didn't really answer your question, and honestly I don't
know whether it is possible.

Maybe with the AudioChannelMerger interface? It seems like the
AudioDestinationNode talks directly to the hardware, since it only has one
input. I tried console.log(context.destination) and it reported the two
channels of my laptop's soundcard...
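
Something like this might work, though I haven't tested it. It's only a
sketch: it assumes the draft's createChannelMerger() factory method, that
connect() accepts an input index as a third argument, and that the merger
stacks the channels of its inputs in order, as the spec describes. It reuses
track1 and track2 from my earlier message below.

var merger = context.createChannelMerger();

// Deck 1 into the merger's first input, deck 2 into the second. If both
// are stereo, the merger's output should carry four channels, assuming
// the destination actually supports more than two.
track1.connect(merger, 0, 0);
track2.connect(merger, 0, 1);

merger.connect(context.destination);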

I can't help more, sorry!

Gabriel

2011/12/1 Robert Clouth <rob.clouth@gmail.com>

> Hi Gabriel, cheers for the reply. In the examples you gave, the streams are
> all being summed at the destination. I need to be able to keep the
> channels separate. For example, for a 5.1 panner you need to send five
> channels to the soundcard (if your device supports it), not just sum them
> and send them to the default two. How do I pick which output on the
> soundcard to send to using context.destination?


> Yeah, MediaElementAudioSourceNode hasn't been implemented yet.
>
Thanks! Good to know, I'll keep an eye on it.

>
>
> Cheers,
>
> Rob
>
>
> On Thu, Dec 1, 2011 at 12:48 PM, Gabriel Cardoso <gcardoso.w@gmail.com> wrote:
>
>> Hi,
>>
>>
>> https://dvcs.w3.org/hg/audio/raw-file/tip/webaudio/specification.html#MixerGainStructure-section
>>
>> Isn't this what you need?
>>
>> You can connect multiple buffers to context.destination and use the
>> noteOn method to start a buffer at whatever time you want. In your case, I
>> think you want to create two gain nodes connected to the destination and
>> connect one track to each.
>>
>> function loadSource(track, url) {
>>   // Load the audio file asynchronously
>>   var request = new XMLHttpRequest();
>>   request.open("GET", url, true);
>>   request.responseType = "arraybuffer";
>>
>>   request.onload = function() {
>>     // Decode the raw data into an AudioBuffer (false = don't mix down to mono)
>>     track.buffer = context.createBuffer(request.response, false);
>>   };
>>
>>   request.send();
>> }
>>
>> var track1 = context.createBufferSource();
>> loadSource(track1, aURL);
>> var track2 = context.createBufferSource();
>> loadSource(track2, anotherURL);
>>
>> var gainNode1 = context.createGainNode();
>> var gainNode2 = context.createGainNode();
>>
>> track1.connect(gainNode1);
>> track2.connect(gainNode2);
>>
>> gainNode1.connect(context.destination);
>> gainNode2.connect(context.destination);
>>
>> I took the loadSource function from the Drum Machine example
>> <http://chromium.googlecode.com/svn/trunk/samples/audio/shiny-drum-machine.html>.
>> Then you can use the noteOn method on track1 and track2 to play the tracks.
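>>
>> For instance (untested, but following the spec's noteOn(when) signature,
>> where when is a time in seconds on the context's timeline):
>>
>> track1.noteOn(0);                        // start track 1 immediately
>> track2.noteOn(context.currentTime + 4);  // start track 2 four seconds later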
>>
>> The problem is that the spec says of the AudioBuffer interface that the
>> length of the PCM data should be less than a minute: "For longer sounds,
>> such as music soundtracks, streaming should be used with the audio element
>> and MediaElementAudioSourceNode."
>>
>> The thing is, I didn't manage to connect and control streaming sources...
>> I don't know whether it is (or will be) possible, or whether it just isn't
>> implemented yet. Does anyone have an idea?
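>>
>> If/when it is implemented, I would expect it to look roughly like this (a
>> hypothetical sketch based on the draft's createMediaElementSource()
>> factory method; the file name is made up):
>>
>> var audio = new Audio("long-soundtrack.mp3");
>> var source = context.createMediaElementSource(audio);
>> source.connect(context.destination);
>> audio.play();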
>>
>> Bye,
>>
>> Gabriel
>>
>>
>> 2011/11/28 Robert Clouth <rob.clouth@gmail.com>
>>
>>> Hi there,
>>>
>>> Firstly, great work with the Web Audio API. It's opened up so many
>>> exciting new avenues! I'm currently working on a WebKit-based DJ app that
>>> depends on being able to send more than two channels to the audio hardware,
>>> so you can play one track while listening to another. Is this possible with
>>> the AudioDestinationNode interface? I noticed there's 5.1 functionality, so
>>> presumably it's possible, but I can't seem to figure out how. There don't
>>> seem to be any examples on the web. Any help would be much appreciated!
>>>
>>> Cheers,
>>>
>>> Rob
>>>
>>>
>>
>

Received on Thursday, 1 December 2011 20:46:41 UTC