
Re: Tracks in multiple Stream . Re: Teasing apart the data API questions

From: Randell Jesup <randell-ietf@jesup.org>
Date: Mon, 23 Apr 2012 17:30:09 -0400
Message-ID: <4F95C9E1.10002@jesup.org>
To: public-webrtc@w3.org
On 4/23/2012 5:08 PM, Harald Alvestrand wrote:
> On 04/23/2012 05:01 PM, Cullen Jennings wrote:
>> On Apr 15, 2012, at 6:42 , Harald Alvestrand wrote:
>>> B1: Should the data "channel" be similar to a MediaStreamTrack,
>>> including the ability to be part of one or more MediaStreams, be
>>> connected to consumer entities, be muted, and so on?
>> I'd like to back up and talk about normal Tracks being attached to
>> more than one stream. I don't think this makes any sense, and it
>> certainly adds to implementation complexity. Let's say we have tracks
>> A, B, and C, and two streams S1 and S2. If A and B are in S1, and B
>> and C are in S2, it effectively means all three are synchronized, so
>> why not just have all three in a single stream?
>> I am missing the use case for a single track in more than one stream.
> The classical example is when one uses GetUserMedia to get audio and
> video, and one wishes to extract only the video stream and show it in a
> preview element, while both the audio and video should be sent to the
> remote site via a PeerConnection.

Correct.  Or attach audio from a single mic to two media streams, one 
for the front camera and one for the rear.  And the two streams may have 
totally different destinations (different PeerConnections, or local 
display as in Harald's example).
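Harald's preview use case could be sketched roughly as follows. (A hedge: this uses the MediaStream constructor, RTCPeerConnection.addTrack, and srcObject from the spec as it later stabilized, not the 2012 draft API; the element selector is hypothetical.)

```javascript
// Get one combined audio+video stream from the user's devices.
navigator.mediaDevices.getUserMedia({ audio: true, video: true })
  .then((combined) => {
    // The same video track participates in two streams.
    const videoTrack = combined.getVideoTracks()[0];

    // Video-only stream, shown locally in a preview element.
    const preview = new MediaStream([videoTrack]);
    document.querySelector('video#preview').srcObject = preview;

    // The full audio+video stream is sent to the remote site.
    const pc = new RTCPeerConnection();
    combined.getTracks().forEach((track) => pc.addTrack(track, combined));
  });
```

The point of the example: muting or stopping the preview stream should not affect the track's delivery over the PeerConnection, which is why a track must be attachable to more than one stream.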

Randell Jesup
Received on Monday, 23 April 2012 21:31:20 UTC
