[webrtc-stats] jitterBufferDelay vs playoutDelay

vr000m has just created a new issue for https://github.com/w3c/webrtc-stats:

== jitterBufferDelay vs playoutDelay ==
In #217, we defined jitterBufferDelay based on the discussion in https://github.com/w3c/webrtc-stats/issues/151.

It is currently defined as:

```
 It is the total time each audio sample or video frame takes from the time it
is received to the time it is rendered. The delay is measured from the time
the first packet belonging to an audio/video frame enters the jitter buffer
to the time the complete frame is sent for rendering after decoding. The average
jitter buffer delay can be calculated by dividing the <code>jitterBufferDelay</code>
by the <code>framesDecoded</code> (for video) or <code>totalSamplesReceived</code> (for audio).
```
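As a minimal sketch of the averaging step described in that definition: `jitterBufferDelay` is a cumulative counter, so the average per-frame (or per-sample) delay is obtained by dividing it by the corresponding count. The helper name and sample values below are illustrative, not part of the spec.

```javascript
// Hedged sketch: average jitter buffer delay from cumulative stats.
// `jitterBufferDelay` is the cumulative delay in seconds;
// `count` is framesDecoded (video) or totalSamplesReceived (audio).
function averageJitterBufferDelay(jitterBufferDelay, count) {
  // Guard against division by zero before any frame/sample is processed.
  return count > 0 ? jitterBufferDelay / count : 0;
}

// Example: 12 s of cumulative delay over 300 decoded video frames
// gives a 0.04 s (40 ms) average delay per frame.
console.log(averageJitterBufferDelay(12, 300)); // 0.04
```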

There was some feedback suggesting this be renamed to `playoutDelay`.

Please view or discuss this issue at https://github.com/w3c/webrtc-stats/issues/223 using your GitHub account

Received on Wednesday, 7 June 2017 08:49:27 UTC