Re: [webrtc-extensions] How does a developer decide on a value for `playoutDelay` ? (#46)

I wanted to weigh in in favor of more fine-grained control than just an enum. As an application developer, I may have a different opinion about what constitutes glitch resilience than a browser developer does. Also, WebRTC has a stats API, so it's not as though app developers are flying completely blind.

What if I think that >= 1 dropped frame and >= 10 NACKs every minute means not enough resilience is being provided?
Or 10 dropped frames and 100 NACKs?
Or an RTT of > 100 ms?
Or some other combination of these that I arrived at through a lot of experimentation?
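
To make that concrete, here is a rough sketch of the kind of check I have in mind, built only on stats that already exist (`framesDropped` and `nackCount` on inbound-rtp, `currentRoundTripTime` on a candidate pair). The thresholds and the `resilienceLooksInsufficient` helper are hypothetical and only meant to illustrate the shape of an app-specific heuristic:

```ts
// Hypothetical thresholds, tuned through experimentation with my own app.
const MAX_DROPPED_FRAMES_PER_MIN = 10;
const MAX_NACKS_PER_MIN = 100;
const MAX_RTT_MS = 100;

interface Snapshot { framesDropped: number; nackCount: number; }

// Compare one minute's worth of stats deltas against my own thresholds.
async function resilienceLooksInsufficient(
  pc: RTCPeerConnection,
  prev: Snapshot
): Promise<{ insufficient: boolean; snapshot: Snapshot }> {
  const report = await pc.getStats();
  let framesDropped = 0;
  let nackCount = 0;
  let rttMs = 0;

  report.forEach((stat) => {
    if (stat.type === 'inbound-rtp' && stat.kind === 'video') {
      framesDropped = stat.framesDropped ?? 0;
      nackCount = stat.nackCount ?? 0;
    }
    // Simplification: take RTT from any succeeded candidate pair.
    if (stat.type === 'candidate-pair' && stat.state === 'succeeded') {
      rttMs = (stat.currentRoundTripTime ?? 0) * 1000;
    }
  });

  // The exact combination (AND vs. OR, which counters, which limits)
  // is precisely the thing each application should get to decide.
  const insufficient =
    framesDropped - prev.framesDropped >= MAX_DROPPED_FRAMES_PER_MIN ||
    nackCount - prev.nackCount >= MAX_NACKS_PER_MIN ||
    rttMs > MAX_RTT_MS;

  return { insufficient, snapshot: { framesDropped, nackCount } };
}
```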

And what if I'm willing to trade some latency for greater glitch resilience, but I have a hard cap on how much latency I'm willing to give up (e.g. 300 ms, 500 ms, or 700 ms)?

And what if I've identified a technique for scaling the delay up at a rate of my own choosing, also found through experiments with my own application's specific use case?
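
Something like the sketch below covers both of those points: ramp the delay up in application-chosen steps, clamped to an application-chosen hard cap. I'm using `playoutDelayHint` on `RTCRtpReceiver` purely as a stand-in for whatever the attribute ends up being called, and the step size and cap are made-up numbers of my own:

```ts
// Hypothetical policy: raise the delay in steps of my choosing,
// but never past the hard latency cap I'm willing to tolerate.
const STEP_S = 0.05;     // 50 ms per adjustment
const HARD_CAP_S = 0.5;  // never trade away more than 500 ms of latency

// `playoutDelayHint` is a placeholder for the proposal's final attribute name.
type DelayControlledReceiver = RTCRtpReceiver & { playoutDelayHint?: number };

function bumpPlayoutDelay(receiver: DelayControlledReceiver): void {
  const current = receiver.playoutDelayHint ?? 0;
  receiver.playoutDelayHint = Math.min(current + STEP_S, HARD_CAP_S);
}
```

Wire the two sketches together (once a minute, if `resilienceLooksInsufficient` says so, call `bumpPlayoutDelay`) and you get exactly the kind of app-specific tuning loop that a scalar makes possible and an enum does not.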

This smells like something that needs a scalar value.

Providing an enum in addition to that sounds fine, though.

P.S. If the assertion is that there is not enough data available to the application to build a good heuristic for adjusting this value, let's beef up the stats, not remove the ability to adjust the value!

-- 
GitHub Notification of comment by AndrewJDR
Please view or discuss this issue at https://github.com/w3c/webrtc-extensions/issues/46#issuecomment-663414523 using your GitHub account

Received on Friday, 24 July 2020 08:42:05 UTC