
Bitrate/quality issues (comments on minutes)

From: Randell Jesup <randell-ietf@jesup.org>
Date: Mon, 25 Jul 2011 09:31:38 -0400
Message-ID: <4E2D703A.6020702@jesup.org>
To: public-webrtc@w3.org
[17:10] <burn> Cullen: receiving video, bit rate is being adjusted, should we know the other side is doing this? When the media we're receiving changes in some way, do we want to be notified?
[17:10] <burn> Roni: why would we?
[17:10] <burn> s/notified/notified in JS/
[17:10] <burn> Cullen: may want to change my screen resolution

Notification on incoming resolution change: yes.  On bitrate change?  I assume the bitrate of a video stream is always changing - it's never constant.  So allow the application to query the current bitrate at any time.  What they do with that is their business.

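For what it's worth, query-on-demand needs nothing from the browser beyond a cumulative byte counter per stream; the application does the rest.  A minimal sketch of the arithmetic - the sample shape here is hypothetical, since no stats API is defined yet:

```javascript
// Derive the receive bitrate between two polled samples.
// Each sample is assumed to carry a cumulative byte count and a
// timestamp in milliseconds (an illustrative shape, not a proposed API).
function bitrateBetweenSamples(prev, curr) {
  const deltaBytes = curr.bytes - prev.bytes;
  const deltaSec = (curr.timestampMs - prev.timestampMs) / 1000;
  if (deltaSec <= 0) return 0;          // guard against clock quirks
  return (deltaBytes * 8) / deltaSec;   // bits per second
}
```

Poll at whatever interval the app cares about; two samples a second apart showing 125000 new bytes gives 1 Mbps.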
[17:11] <burn> ... for bit rate, if all my streams just dropped their bit rate I may in the JS decide to close some of my streams.
[17:11] <burn> (general agreement that this is useful info)
[17:12] <burn> Christer: if quality is decreasing, for example, could remove video to improve audio.

Careful: are you talking incoming bitrate, or outgoing?  (The point above was about incoming; this sounds like outgoing.)  Yes, it's useful to be able to query it.  We *could* set triggers for notification - but that's more complex, and I don't think the added utility is worth the complexity.
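One reason triggers may not be worth speccing: an application can build its own on top of polling.  A sketch of a downward-crossing detector - all names here are illustrative, nothing below is proposed API:

```javascript
// Fire onDrop once each time the polled bitrate crosses below the
// threshold, re-arming only after it recovers.  The app feeds in each
// polled value itself; no browser-side notification is needed.
function makeDropDetector(thresholdBps, onDrop) {
  let above = true;                     // assume we start above threshold
  return function sample(currentBps) {
    if (above && currentBps < thresholdBps) {
      above = false;
      onDrop(currentBps);               // fire once on the downward crossing
    } else if (currentBps >= thresholdBps) {
      above = true;                     // re-arm once bitrate recovers
    }
  };
}
```

An app that wanted Christer's behavior could hang "drop the video track" off the callback.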

[17:13] <burn> DarylMalis??: good to collect and make use of this. My concern is that in practice this info is often used only to decrease the quality of the end result, never to improve it.
[17:13] <burn> Tim: bitrate is a terrible proxy for quality
[17:13] <burn> ... maybe everyone stopped moving or talking
[17:13] <burn> ... exposing quality info is very codec-specific

Agreed, bitrate is a horrible stand-in for quality.  Not to mention that I hate codecs/rate-controllers that try to hold a constant quality level (and drop the bitrate when less is happening), compared to codecs that always try to provide the highest quality they can for a given bitrate.  (Eventually it hits diminishing returns to use more bits when things are static, so limiting it at some point is ok - but the limit should be relatively high.)

We should consider warning in the spec not to assume bitrate == quality.
Application developers WILL make that mistake.

[17:14] <burn> Magnus: this is really about providing congestion info, right?
[17:14] <burn> hta: this is difficult to do in real time.
[17:14] <burn> ... we can get info on sender's changes.
[17:15] <burn> Cullen: trying to keep this simple, e.g. either sender changed resolution or reduced cap on bandwidth.
[17:15] <burn> Tim: difficult to detect cap on bandwidth
[17:16] <burn> Daryl: with clients using adaptive bitrates, they will lower the rate when nothing's happening and then increase back up when there is motion/sound.
[17:16] <burn> EKR: what we need is a way for the sender to say to the receiver "I'm having to back off here"

And there's the question of what the receiver would do with that info.  I'm not sure there's utility for the receiver here.

We also need to talk (in the IETF) about mechanisms for congestion control.

[17:17] <burn> Cullen: summary is we like this but it's hard and we don't really know how to do it properly (like packet loss concealment)
[17:17] <burn> ... presuming going to legacy devices via gateways. Do we have enough signaling info?
[17:18] <burn> Matthew: out of scope.


-- 
Randell Jesup
randell-ietf@jesup.org
Received on Monday, 25 July 2011 13:33:16 GMT