Re: [webrtc-stats] Define reception time for jitterBufferDelay stat (#549)

I think the time a packet is received is hard to pinpoint exactly. Packets go through many layers before reaching WebRTC code, such as the network card, the OS, and various Chrome layers. Each of these layers can add some delay (especially under high CPU load), so starting the timer when the packet reaches WebRTC code is a bit arbitrary.

Ideally there should be very little delay between a packet reaching WebRTC code and reaching NetEq, and in the vast majority of cases this is true. If there is high delay in this part of the code, that should be considered a bug in the implementation.

The name "jitterBufferDelay" suggests that it is the delay that is added by the jitter buffer, and this delay is different from other accidental sources of delay, because the implementation actively chooses to add it. There is also delay after audio leaves the jitter buffer, for example in Chrome layers, the audio driver and the audio hardware, and we also do not include these delays in this metric.

-- 
GitHub Notification of comment by ivocreusen
Please view or discuss this issue at https://github.com/w3c/webrtc-stats/issues/549#issuecomment-583443296 using your GitHub account

Received on Friday, 7 February 2020 15:10:33 UTC