Re: Multiple destinations in a single AudioContext

That's great, Chris, thanks for the pointer. I will update the Implementation Notes for that use case.

Olivier

On 13 Aug 2012, at 18:26, "Chris Rogers" <crogers@google.com> wrote:

> 
> 
> On Mon, Aug 13, 2012 at 8:53 AM, olivier Thereaux <olivier.thereaux@bbc.co.uk> wrote:
> Hello,
> 
> As you know, Joe and I have been working on our Use Cases and Requirements document, detailing the specific requirements illustrated by each of our scenarios.
> 
> One of the scenarios we added a few months ago was that of an "online DJ set".
> The current text of the scenario can be read at:
> http://dvcs.w3.org/hg/audio/raw-file/tip/reqs/Overview.html#connected-dj-booth
> 
> One interesting aspect of the scenario is that it illustrates the need for audio to be sent to two different destinations (headphones + streaming). Even more interestingly, the application needs to switch a given context from one destination to another gradually and seamlessly.
> 
> I cannot quite figure out how that could be done with the current draft of the Web Audio API. Ideally, it would look like this: https://dvcs.w3.org/hg/audio/raw-file/tip/reqs/DJ.png but given the API's constraint on the number of destinations a context can have, that is not possible.
> 
> This might just work:
> https://dvcs.w3.org/hg/audio/raw-file/tip/reqs/DJ2.png
> if a given AudioContext is allowed to have two AudioDestinationNodes in its graph, but only one is connected as destination at any given time. This is a much lesser solution, because the DJ cannot continue listening to headphones as she fades the second track in, but it might just be doable.
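>
> In rough pseudo-code, the gradual hand-over would look something like this (entirely hypothetical: "deck", "headphonesOut" and "broadcastOut" stand in for a source and two destination nodes, which the current draft does not allow, and method names are approximate):
>
>     // Hypothetical sketch: crossfading one deck between two outputs.
>     var toHeadphones = context.createGain();
>     var toBroadcast = context.createGain();
>     deck.connect(toHeadphones);
>     deck.connect(toBroadcast);
>     toHeadphones.connect(headphonesOut);   // not possible today: only context.destination exists
>     toBroadcast.connect(broadcastOut);     // likewise hypothetical
>
>     // Fade the deck from the headphones to the broadcast output over 5 seconds.
>     var t = context.currentTime;
>     toHeadphones.gain.setValueAtTime(1, t);
>     toHeadphones.gain.linearRampToValueAtTime(0, t + 5);
>     toBroadcast.gain.setValueAtTime(0, t);
>     toBroadcast.gain.linearRampToValueAtTime(1, t + 5);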
> 
> Am I missing an obvious alternative that would make this scenario possible? Any thoughts on how else you'd do it? Note that I am not saying that the API *MUST* enable this scenario - but it is interesting food for thought.
> 
> Olivier
> 
> Hi Olivier, you may have noticed the addition of createMediaStreamSource() to the specification.  It represents a source from a MediaStream, and work is underway in WebKit to support this now, including live audio input.  There will be a matching createMediaStreamDestination() for sending to remote peers.
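>
> Roughly, that could cover the streaming leg of the DJ scenario like this (a sketch only: createMediaStreamDestination() is not in the draft yet, and method names may still change):
>
>     var context = new AudioContext();      // currently webkitAudioContext in WebKit builds
>     var mix = context.createGain();        // the DJ's main mix bus
>
>     // Local monitoring through the normal audio hardware.
>     mix.connect(context.destination);
>
>     // The same mix also feeds a MediaStream, which could then be handed
>     // to a PeerConnection (or similar) for the remote listeners.
>     var streamOut = context.createMediaStreamDestination();
>     mix.connect(streamOut);
>     // peerConnection.addStream(streamOut.stream);   // WebRTC wiring, sketched only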
> 
> For the online DJ case, where headphone output is required in addition to the "main" audio output, multi-channel devices can be supported as described so far in the spec (.numberOfChannels and .maxNumberOfChannels of AudioDestinationNode).  We're working hard on implementing that part right now.
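>
> A sketch of the headphone-cue side under that model, assuming a 4-channel device where channels 0/1 carry the main output and 2/3 drive the DJ's headphones, with "mainMix" and "cueMix" standing in for the application's two stereo busses (attribute names as in the current draft):
>
>     // Ask the destination for four output channels, if the hardware allows it.
>     if (context.destination.maxNumberOfChannels >= 4)
>         context.destination.numberOfChannels = 4;
>
>     // Split each stereo bus so its channels can be placed on specific
>     // hardware channels, then merge everything into one 4-channel signal.
>     var mainSplit = context.createChannelSplitter(2);
>     var cueSplit = context.createChannelSplitter(2);
>     var merger = context.createChannelMerger(4);
>     mainMix.connect(mainSplit);
>     cueMix.connect(cueSplit);
>
>     mainSplit.connect(merger, 0, 0);   // main L -> channel 0
>     mainSplit.connect(merger, 1, 1);   // main R -> channel 1
>     cueSplit.connect(merger, 0, 2);    // cue  L -> channel 2 (headphones)
>     cueSplit.connect(merger, 1, 3);    // cue  R -> channel 3 (headphones)
>
>     merger.connect(context.destination);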
> 
> p.s. The W3C server hosting the Web Audio spec seems to be down right now...
> 


Received on Monday, 13 August 2012 19:39:06 UTC