Re: Split simulcast encodings into N RtpSenders

Didn't we just have this discussion?

https://lists.w3.org/Archives/Public/public-ortc/2016Apr/0020.html

Short answer: if an application wants to have one encoding in each
RtpSender/RtpReceiver, it is free to do so, but it's not a good idea for
various reasons, so the API gives the application a better way (multiple
encodings per RtpSender/RtpReceiver).
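
For reference, a minimal, non-normative sketch of the "multiple
encodings per RtpSender" approach with the ORTC API (assuming a
MediaStreamTrack and a DtlsTransport are already in place; the payload
types and SSRCs below are purely illustrative):

  // Hedged sketch, not taken from the spec examples: one RtpSender
  // carrying both simulcast encodings. An SFU can then choose which
  // SSRC to forward at any given time.
  const sender = new RTCRtpSender(track, transport);
  sender.send({
    codecs: [
      { name: 'VP8',  payloadType: 96, clockRate: 90000 },
      { name: 'H264', payloadType: 97, clockRate: 90000 },
    ],
    encodings: [
      // One encoding per codec within the same sender.
      { codecPayloadType: 96, ssrc: 1111, active: true },
      { codecPayloadType: 97, ssrc: 2222, active: true },
    ],
  });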

On Thu, Jun 2, 2016 at 2:12 PM, Iñaki Baz Castillo <ibc@aliax.net> wrote:

> Hi,
>
> The rules for sending simulcast (let's say VP8 and H264 at the same
> time over different streams) imply having a single RtpSender with two
> encodings (one for VP8 and another one for H264).
>
> But when it comes to the RtpReceiver, passing those parameters to
> receive() means that the receiver should be able to switch between
> both streams at any time. As explained in #558, that is hard and
> requires RFC 6051 and so on.
>
> So it's clear IMHO that, on the receiver side, it's much better to
> create two separate receivers, and thus separate MediaStreamTracks for
> VP8 and H264, and render the active one.
>
> If so, why don't we encourage simulcast via N RtpSenders, each having
> a single encoding? That would produce a proper 1:1 mapping between
> RtpSenders and RtpReceivers.
>
> NOTE: Of course I'm talking about some kind of SFU scenario in which
> the SFU can decide which encoding to forward at any time.
>
> Is there any impediment regarding this topic due to WebRTC 1.0? In
> other words: would it make sense to ask the WebRTC WG that
> PeerConnection.getSenders() return as many RtpSenders as there are
> encodings?
>
> Well, given that those N simulcast encodings are expressed within a
> single m= line, I can guess the problem... The RtpTransceiver would
> then have to have N RtpSenders rather than just one. Is that it?
>
> Thanks a lot.
>
>
>
> [#558] https://github.com/openpeer/ortc/issues/558
>
>
> --
> Iñaki Baz Castillo
> <ibc@aliax.net>
>
>

Received on Thursday, 2 June 2016 21:51:48 UTC