- From: Ian Hickson <ian@hixie.ch>
- Date: Mon, 18 Jul 2011 18:45:52 +0000 (UTC)
- To: Stefan Håkansson LK <stefan.lk.hakansson@ericsson.com>
- cc: public-webrtc@w3.org, public-audio@w3.org
- Message-ID: <Pine.LNX.4.64.1107181842530.3775@ps20323.dreamhostps.com>
On Mon, 18 Jul 2011, Stefan Håkansson LK wrote:
> In
> <http://datatracker.ietf.org/doc/draft-ietf-rtcweb-use-cases-and-requirements/>
> there is a use case that uses two cameras: Hockey game viewer. There one
> camera is used to show the game, the other a view of a person attending
> the match (and participating in an on-line discussion about the game).
>
> I guess that if the web app gets access to both cameras switching views
> would be simple to implement. You could be lazy and transmit both videos
> all the time but select which one is displayed in the video element, or
> you could enable/disable tracks or add/remove streams at the sending
> side if you like to save transmission.

The current API already supports this -- you'd just get the two cameras
using getUserMedia(), add both streams, and then mute the one you don't
want to send.

We may want to consider a way to merge MediaStream objects into one
(currently there's only the ability to fork), so that you could just send
one stream and dynamically pick one or the other. But it seems if we do
that we'd probably also want to be able to do fades and other transitions
for the video, and of course we'd probably want this to hook right into
the audio API. It's unfortunate that this area is split across multiple
working groups at the W3C.

-- 
Ian Hickson               U+1047E                )\._.,--....,'``.    fL
http://ln.hixie.ch/       U+263A                /,   _.. \   _\  ;`._ ,.
Things that are impossible just take longer.   `._.-(,_..'--(,_..'`-.;.'
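[Editor's note: the "send both, mute one" approach Hickson describes can be sketched as follows, using today's names for these APIs (navigator.mediaDevices.getUserMedia, RTCPeerConnection.addTrack, MediaStreamTrack.enabled), which postdate this 2011 mail. The selectActiveTrack helper and the device-id variables are hypothetical, for illustration only.]

```javascript
// Hypothetical helper: enable exactly one of several video tracks.
// Setting track.enabled = false is the standard per-track mute switch;
// a disabled track keeps its slot in the connection but sends blackness.
function selectActiveTrack(tracks, activeIndex) {
  tracks.forEach((track, i) => {
    track.enabled = (i === activeIndex);
  });
  return tracks[activeIndex];
}

// In a browser this would be wired up roughly like so (sketch only;
// gameCamId / faceCamId / pc are assumed to exist in the page):
//
//   const game = await navigator.mediaDevices.getUserMedia(
//     { video: { deviceId: gameCamId } });
//   const face = await navigator.mediaDevices.getUserMedia(
//     { video: { deviceId: faceCamId } });
//   const tracks = [game.getVideoTracks()[0], face.getVideoTracks()[0]];
//   tracks.forEach(t => pc.addTrack(t, game)); // add both to the connection
//   selectActiveTrack(tracks, 0);              // show the game camera
//   selectActiveTrack(tracks, 1);              // later: cut to the attendee
```

This is the "lazy" variant from Stefan's mail taken halfway: both tracks stay attached to the connection, so switching is instant, but the muted one is not encoded at full cost.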
Received on Monday, 18 July 2011 18:46:19 UTC