Re: [webrtc-stats] Stats to keep track of sync between audio and video

Should this timestamp be continuously increasing between frames/audio samples, i.e. not discontinuously jumping every time a frame (or audio buffer; how would that work?) arrives? If it jumps, that would complicate things when audio and video jump by different amounts of time or at different points in time.
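
To make that concern concrete, here is a rough sketch (the names `history` and `smoothedOffset` are just illustrative, not anything proposed) of how one might have to smooth the computed offset if the timestamp only advances per frame or per audio buffer:
```
// If currentPlayoutTimestamp only advances when a frame or audio buffer
// arrives, the instantaneous audio/video offset can jitter by up to one
// frame interval, since the two timestamps jump at different times.
// A running average over recent samples would hide that jitter.
const history = [];
function smoothedOffset(sampleOffset, windowSize = 10) {
  history.push(sampleOffset);
  if (history.length > windowSize) history.shift();
  return history.reduce((sum, x) => sum + x, 0) / history.length;
}
```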

What happens if you fall behind because there is nothing to play? Does 
`currentPlayoutTimestamp` continue to increase while 
`HTMLMediaElement.currentTime` (`remoteVideo.currentTime`) stops?
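
Rough sketch of that case (field names and thresholds are illustrative only), comparing two samples taken about a second apart:
```
// If playout starves, the proposed currentPlayoutTimestamp would keep
// advancing while the element's currentTime stalls, inflating the
// computed delay. Each sample is { playoutTimestamp, currentTime },
// taken roughly one second apart; the thresholds are illustrative.
function looksStarved(prevSample, currSample) {
  const playoutAdvance = currSample.playoutTimestamp - prevSample.playoutTimestamp;
  const playbackAdvance = currSample.currentTime - prevSample.currentTime;
  return playoutAdvance > 0.5 && playbackAdvance < 0.1;
}
```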

Is the goal this?
```
// E2E delay
let e2eVideoDelay = videoTrackStats.currentPlayoutTimestamp - remoteVideo.currentTime;
let e2eAudioDelay = audioTrackStats.currentPlayoutTimestamp - remoteAudio.currentTime;
// A/V sync
let audioOffsetFromVideo = e2eVideoDelay - e2eAudioDelay;
```

Unlike `remoteAudio.currentTime - remoteVideo.currentTime`, this works even if audio and video did not start sending at exactly the same time from the remote source.
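
For completeness, a minimal sketch of how this could be wired up with `getStats()`. It assumes the proposed `currentPlayoutTimestamp` would be exposed on the receiver-side track stats and that those stats carry a `kind` field; neither is confirmed by the current spec text:
```
// Hedged sketch: stats.type === 'track', stats.kind and
// stats.currentPlayoutTimestamp are assumptions, not spec.
async function measureAvSync(pc, remoteAudio, remoteVideo) {
  const report = await pc.getStats();
  let audioPlayout, videoPlayout;
  report.forEach(stats => {
    if (stats.type === 'track' && stats.remoteSource) {
      if (stats.kind === 'audio') audioPlayout = stats.currentPlayoutTimestamp;
      if (stats.kind === 'video') videoPlayout = stats.currentPlayoutTimestamp;
    }
  });
  if (audioPlayout === undefined || videoPlayout === undefined) return null;
  const e2eVideoDelay = videoPlayout - remoteVideo.currentTime;
  const e2eAudioDelay = audioPlayout - remoteAudio.currentTime;
  return { e2eVideoDelay, e2eAudioDelay,
           audioOffsetFromVideo: e2eVideoDelay - e2eAudioDelay };
}
```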

-- 
GitHub Notification of comment by henbos
Please view or discuss this issue at 
https://github.com/w3c/webrtc-stats/issues/158#issuecomment-278330845 
using your GitHub account

Received on Wednesday, 8 February 2017 13:38:41 UTC