- From: Erik Språng via GitHub <sysbot+gh@w3.org>
- Date: Fri, 04 Oct 2024 18:44:06 +0000
- To: public-webrtc-logs@w3.org
> Depending on the math behind `totalCorruptionProbability` and `totalSquaredCorruptionProbability`, the fact that these are `double`s, and whether they are ever-increasing quantities, could numerical issues mount when incrementing over longer times?

Doubtful. Even if a call runs for a very, very long time, remember that the individual samples are capped to the [0.0, 1.0] range and are likely only (individually) useful to a few decimal places. With 52 bits of mantissa, that still allows an enormous number of samples to be accumulated without noticeable loss. Further reducing the scope of this problem is that we expect the "normal" values to be essentially 0.0. If you run extremely long sessions with constant corruption, you might have other problems to prioritize first.

--
GitHub Notification of comment by sprangerik
Please view or discuss this issue at https://github.com/w3c/webrtc-stats/pull/788#issuecomment-2394362074 using your GitHub account

--
Sent via github-notify-ml as configured in https://github.com/w3c/github-notify-ml-config
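The accumulation argument above can be illustrated with a small sketch (not WebRTC code; the sample count and per-sample value are hypothetical): naively summing many [0.0, 1.0]-capped values into a running `double` and comparing against an exactly rounded reference shows the relative error stays negligible.

```python
import math

# Hypothetical workload: many small corruption-probability samples, each
# capped to [0.0, 1.0], accumulated the way a running counter like
# totalCorruptionProbability would be.
n = 1_000_000          # assumed number of samples in a long session
sample = 0.001         # assumed per-sample corruption probability

total = 0.0
for _ in range(n):
    total += sample    # naive running accumulation in a double

# math.fsum computes a correctly rounded sum for reference.
exact = math.fsum([sample] * n)
rel_error = abs(total - exact) / exact
print(rel_error)       # tiny: the 52-bit mantissa absorbs millions of samples
```

Even after a million additions, the relative error of the naive running sum is orders of magnitude below the few decimal places an individual sample is useful to, which is the point of the reply above.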
Received on Friday, 4 October 2024 18:44:07 UTC