Re: [webrtc-pc] Missing specification on how to assign bandwidth between encodings and/or drop simulcast layers (#2141)

I don't like `minBitrate` at all. It may penalize "super efficient" encoders. Just imagine you send audio with DTX enabled and set `minBitrate: 20000`. During silence the bitrate will be close to zero. What would that mean? Should the browser "disable" such an encoding?

Coming back to the issue and proposals above:

As Adam Roach said on Twitter, there is some spec (I don't know which one it is) that states in section 7.2:

> What simulcast streams to prioritize when allocating available bitrate among the simulcast streams in such adaptation SHOULD be taken from the simulcast stream order on the 'a=simulcast' line and ordering of alternative simulcast formats

Well, `sendEncodings` is definitely more than just simulcast. In the future we'll want to declare SVC layer dependencies across multiple `sendEncodings`, and when that happens there won't be any `a=simulcast` line in the SDP. So assuming that the order of `sendEncodings` determines the "bw allocation policy" for each encoding is just wrong.

That said, we need a way to tell the encoder which simulcast streams to drop first when there are bandwidth issues. Relying on the order of the given `sendEncodings` is just ugly (for the reasons given above). The order of `rids` in the `a=simulcast` line should not be determined by the order of `sendEncodings` but by whatever policy the application chooses:

1. I may want to always send a very low resolution thumbnail video, plus a medium and a high resolution encoding, prioritizing the high one (the use case @murillo128 gave on Twitter).
1. I may want to send two video encodings with different `scaleResolutionDownBy` and `maxFramerate` but the **same** resulting effective bitrate, both with the same "priority" (so instead of disabling one of them upon bandwidth issues, the browser should decrease the resolution and framerate of both if possible).
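To make those two use cases concrete, here is a sketch of the `sendEncodings` each would pass to `addTransceiver()`. All rids, bitrates, and scaling values below are illustrative, not normative:

```javascript
// Use case 1: low-res thumbnail always on; prefer the high encoding over
// the medium one when congestion hits.
const thumbnailFirst = [
  { rid: 'thumb', scaleResolutionDownBy: 8, maxBitrate: 60000 },
  { rid: 'med',   scaleResolutionDownBy: 2, maxBitrate: 500000 },
  { rid: 'high',  scaleResolutionDownBy: 1, maxBitrate: 1500000 },
];

// Use case 2: two encodings trading resolution against framerate, with the
// same effective bitrate; neither should be dropped before the other.
const equalPeers = [
  { rid: 'hires-lowfps', scaleResolutionDownBy: 1, maxFramerate: 10, maxBitrate: 800000 },
  { rid: 'lowres-hifps', scaleResolutionDownBy: 2, maxFramerate: 30, maxBitrate: 800000 },
];
```

Either array would go into `pc.addTransceiver(track, { sendEncodings: thumbnailFirst })`. Note that nothing in the encodings themselves tells the browser which drop policy the app wants, which is exactly the point.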

There is no single "algorithm" by which the browser can know what the application wants, so IMHO the best way to go is an explicit setting chosen by the app:

* In the "past" we had a `priority` field nobody never properly understood what it was for (BTW Chrome M75 rejects `addTransceiver()` if `priority` field is given in any encoding).
* We may reuse that `priority` field or add a new one (such as the `dropOrder` field Sergio suggested above).
* Whichever it is, it MUST be chosen by the app. The browser **CANNOT** deduce it from the order of `sendEncodings` nor from the `maxBitrate`, `scaleResolutionDownBy`, etc. values. It cannot deduce it because different legit use cases require different behaviors given the **SAME** encodings.

And a final proposal: given such a `dropOrder` (or `priority` or whatever) value chosen by the app, the browser should build the `a=simulcast` line with the `rids` ordered to honor those settings (rather than ordered by their position within the `sendEncodings` array). This way we remain "SDP simulcast compliant" (whatever that means) without constraining the exposed API because of a wrong "simulcast" assumption (it may be SVC in the future; let's not assume what `sendEncodings` means!).
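As a rough sketch of that proposal (the `dropOrder` field is hypothetical; here I assume `1` means "dropped first" under congestion), the browser could derive the rid order for the `a=simulcast` line like this:

```javascript
// Per the quoted section 7.2 text, earlier rids on the a=simulcast line get
// bitrate-allocation priority, so the encoding dropped last must come first.
// `dropOrder` is a hypothetical field: 1 = first to drop.
function simulcastRidOrder(sendEncodings) {
  return sendEncodings
    .slice() // do not mutate the app-provided array
    .sort((a, b) => b.dropOrder - a.dropOrder)
    .map((e) => e.rid);
}

// sendEncodings order is deliberately unrelated to the drop policy:
const encodings = [
  { rid: 'thumb', dropOrder: 3 }, // thumbnail: dropped last
  { rid: 'med',   dropOrder: 1 }, // dropped first
  { rid: 'high',  dropOrder: 2 },
];

simulcastRidOrder(encodings); // → ['thumb', 'high', 'med']
```

The resulting line would read `a=simulcast:send thumb;high;med`, honoring the app's policy rather than the array order.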

GitHub Notification of comment by ibc
Please view or discuss this issue using your GitHub account

Received on Sunday, 24 March 2019 14:32:44 UTC