Re: [webrtc-extensions] How does a developer decide on a value for `playoutDelay` ? (#46)

IMHO this API would be quite useless if we allow setting a `playoutDelayHint` of 50ms and end up with a jitter buffer delay of 2s for several minutes. That would still be the case if we set it to `interactive` and get 2s delays because internally NetEq decides that it is better to converge slowly than to drop packets.

We could state that the hint is the minimum value that the jitter buffer can take, but again NetEq could decide to ramp up slowly and take several minutes until that delay is achieved.

While a bit more complicated, I think the best alternative is to be able to define `min` and `max` values that are strictly enforced by the jitter buffer. If the current delay is lower than the `min` value, then it should buffer packets until the `min` is reached before starting playback. Also, if the delay is above `max`, packets should be dropped so that the delay is never larger than the `max` value.
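The strict `min`/`max` policy could be sketched roughly as below (a hypothetical helper for illustration only; the function name, units, and return values are my own assumptions, not part of any WebRTC API):

```javascript
// Sketch of the proposed strict min/max jitter-buffer policy.
// All delays are in milliseconds. Returns the action the jitter
// buffer would take for the current delay:
//   "buffer" - hold packets until the min delay is reached
//   "drop"   - discard packets so the delay never exceeds max
//   "play"   - delay is within [min, max]: normal playback
function jitterBufferAction(currentDelayMs, minMs, maxMs) {
  if (currentDelayMs < minMs) return "buffer";
  if (currentDelayMs > maxMs) return "drop";
  return "play";
}

// Example: min = 50ms, max = 2000ms
console.log(jitterBufferAction(30, 50, 2000));   // "buffer"
console.log(jitterBufferAction(100, 50, 2000));  // "play"
console.log(jitterBufferAction(2500, 50, 2000)); // "drop"
```

Unlike a hint, this policy leaves no room for the implementation to converge slowly: the bounds hold at every instant.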


-- 
GitHub Notification of comment by murillo128
Please view or discuss this issue at https://github.com/w3c/webrtc-extensions/issues/46#issuecomment-663467140 using your GitHub account

Received on Friday, 24 July 2020 10:09:45 UTC