
Re: I have created a PR for RtpEncodingParameters.maxBitrate

From: Silvia Pfeiffer <silviapfeiffer1@gmail.com>
Date: Sun, 19 Jul 2015 18:12:10 +1000
Message-ID: <CAHp8n2nCb+5qPXNSL=w10=95PxgrGDGCV5qG9nMjkM6F=wbM1w@mail.gmail.com>
To: Randell Jesup <randell-ietf@jesup.org>
Cc: public-webrtc <public-webrtc@w3.org>

On Sun, Jul 19, 2015 at 2:43 PM, Randell Jesup <randell-ietf@jesup.org> wrote:
> On 7/18/2015 11:07 PM, Silvia Pfeiffer wrote:
>> On Sun, Jul 19, 2015 at 12:41 PM, Randell Jesup <randell-ietf@jesup.org>
>> wrote:
>>> So there are a couple of issues around bitrates, not all equally
>>> important:
>>> * Control of the bitrate (max/min) for a specific stream
>> Unless you know the available bandwidth overall for the device and the
>> minimum bandwidth necessary to encode a stream of data, these are
>> actually pretty useless to a developer.
>> For example, the min bitrate required to send audio or video is codec
>> and resolution specific and the browser seems in a much better
>> position than the developer to know this boundary.
> Mostly I agree - and most apps shouldn't be messing with these. However,
> this isn't always the case. The app does have some control over the
> resolution/framerate, and also it may want to limit the bandwidth used by
> (for example) screensharing (or limit the bandwidth used for a thumbnail
> talking-head that's being layered on top of a screenshare at the receiving
> end).
> I will note this is a *common* request.  And apps are currently editing
> (Chrome) SDPs to adjust max bitrates.
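
For reference, the SDP editing you mention usually amounts to a string hack like the sketch below: inserting a "b=AS:<kbps>" bandwidth line into the video m-section before the description is applied. The helper name is made up for illustration, not a standard API.

```javascript
// Sketch of the SDP-munging workaround: insert a "b=AS:<kbps>" line
// into the video m-section to cap the send bitrate in Chrome.
// setVideoMaxBitrate is a made-up helper name.
function setVideoMaxBitrate(sdp, maxKbps) {
  const out = [];
  let inVideo = false;
  for (const line of sdp.split('\r\n')) {
    if (line.startsWith('m=')) inVideo = line.startsWith('m=video');
    if (inVideo && line.startsWith('b=AS:')) continue; // drop any existing cap
    out.push(line);
    // Per RFC 4566, b= lines follow the c= line of their section.
    if (inVideo && line.startsWith('c=')) out.push('b=AS:' + maxKbps);
  }
  return out.join('\r\n');
}
```

The munged string would then be passed to setLocalDescription/setRemoteDescription in place of the original.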

My paragraph above was about min bitrates.

I agree that you want to set the max bandwidth used - but preferably
as a relative number rather than an absolute one. Mostly that's about
leaving some bandwidth for other operations of the device.
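
For concreteness, once the RtpEncodingParameters.maxBitrate from the PR exists, the cap could presumably be applied per sender without any SDP editing, something like this sketch (applyMaxBitrate is a made-up helper that just edits the plain parameters object):

```javascript
// Sketch of setting the proposed RtpEncodingParameters.maxBitrate
// (bits per second) on each encoding of a sender's parameters object.
// applyMaxBitrate is a made-up helper name.
function applyMaxBitrate(params, maxBps) {
  (params.encodings || []).forEach(function (encoding) {
    encoding.maxBitrate = maxBps;
  });
  return params;
}

// In a page with a live RTCRtpSender (not runnable here):
// sender.setParameters(applyMaxBitrate(sender.getParameters(), 500000));
```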

> We've recently reduced the need to do this by (in Firefox) making the upper
> & lower bandwidth bounds depend on the resolution/framerate of the input
> video - throwing 2Mbps at a QQVGA (160x120) stream is pretty silly - in
> fact, 2Mbps is slight overkill even for VGA.  And 2Mbps is way too low for
> an HD 1080 stream, let alone a full-resolution screenshare of a Retina
> screen.


>> Similarly, the max bitrate is really dependent on the available
>> bandwidth to the device and may fluctuate. So, the browser should be
>> in a much better position to know this, too (despite Chrome currently
>> randomly choosing 2Mbps per video stream).
> Sure, congestion may well keep you below the max - max is just "don't go
> over this even if you seem to have the bandwidth".
>> As a developer, I would really like to see controls that are relative
>> to these boundaries, not absolute numbers. E.g. I'd like to tell each
>> outgoing/incoming video stream to use no more than 25% of my available
>> bandwidth - which would then adapt to the available bandwidth (looks
>> like RTP would then also need to negotiate the minimum of the two).
>> Then, e.g., the data channel should use no more than 40% of my
>> available bandwidth - since it's bursty, it can take a bit more.
> That requires unified congestion control - and for many use-cases, unified
> across multiple PeerConnections.  In non-bundle in theory the RTP
> destinations can be different, and it's possible routers will apply
> different constrictions per stream (especially if the DSCP marks are
> different).
> I would imagine that the real thing you'd want is to *not* give equal
> amounts to multiple video channels in a bundle, and that by default they'd
> get equal shares if resolutions are equal.  (I.e. I care more about this one
> than that).

Priority between different channels in a bundle is an interesting
parameter that the browser indeed needs to know about if we expect it
to deal with individual streams differently. You can't tell a browser
your priority by setting bitrates - that's like saying "I don't know
how many apples I have, but I have a family of 3 to feed and I will
give the first person 6 apples, the second one 3, and the third the
rest." Then, when it turns out you only have 2 apples to distribute,
how do you decide who gets how much?

Preferably, as a developer, I don't want to have to deal with setting
a limit at all, because the browser should have better information
about all the facts relevant to choosing a bitrate: the CPU used for
encoding and decoding, the bandwidth available for sending and
receiving, the bandwidth available for sending and receiving at the
other end, and the number of dropped packets. I firmly believe that
requests to let developers set bitrates manually are based on observed
misbehaviour of browsers that do a bad job of picking an adequate
bitrate, which is a shame.

>> In any case, absolute numbers are always going to be wrong and
>> suboptimal, so I'd much prefer going with relative numbers.
> Maybe so.   But this is a different request than Maximum bitrate, you're
> talking about interacting with the parceling out of the available bits (when
> we don't have enough to hit max on all streams).

In a perfect world, the bitrate that I would choose for encoding a
video stream would be limited by the bitrate that I get for
transmitting the stream (including all the bandwidth that the network
intermediaries allow me), wouldn't it? If I encode at a higher bitrate
than I can send, then I'm basically screwed. If I encode at lower
bitrate, then I create a worse quality stream than would be possible.
This applies as much to the individual channels in a bundle as it
applies to the whole bundle.

Anyway - I actually do want a parameter to set limits on maximum
bandwidth (not so much on minimum though). I want it to be a
percentage though, because that allows me to provide priority hints to
the browser.
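
Concretely, the percentage hint could reduce to a tiny recomputation step: the app states relative shares once, and absolute per-stream caps are recomputed from whatever total bandwidth estimate is currently available. This is just a sketch; shareToCaps and the share names are made up for illustration.

```javascript
// Sketch: convert relative shares into absolute per-stream caps from
// the current total bandwidth estimate, so the app's priorities
// survive estimate changes (unlike fixed absolute numbers).
// shareToCaps is a made-up helper name.
function shareToCaps(totalBps, shares) {
  const sum = Object.values(shares).reduce(function (a, b) { return a + b; }, 0);
  const caps = {};
  for (const name of Object.keys(shares)) {
    caps[name] = Math.floor(totalBps * shares[name] / sum);
  }
  return caps;
}
```

The same { video: 25, thumbnail: 10, data: 40 } shares then yield sensible caps whether the estimate is 3 Mbps or 300 kbps, which is exactly what fixed absolute numbers cannot do.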

Received on Sunday, 19 July 2015 08:12:58 UTC
