Re: [webrtc-pc] Clarify that getSynchronizationSources() should return information even if the track has no sink (<video> tag) (#2240)

It seems like Chrome is picking up side-effects from trying to meet the new spec definition of when *"an RTP packet is delivered to the RTCRtpReceiver's MediaStreamTrack"*. As I recall, that definition was a compromise time-point somewhere between RTP arrival and playout. It's still not a JS-observable time-point, and it's not guaranteed to be the same in every implementation, so why pick it?

Because it was as close as we could normatively get to playout without involving other specs. After all, the synchronous `getSynchronizationSources()` method's intent remains to help draw volume bars next to people talking (which can be derived solely from RTP), which meant A) drawing them whether you hear the speakers or not, and B) having the bars sync with the talking when you do.
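As a minimal sketch of that volume-bar use case: `RTCRtpSynchronizationSource` entries expose an `audioLevel` in the 0..1 linear range, so an app can poll the receiver and map that to a bar width with no sink attached. The `levelToBarWidth` helper and `pollLevels` function names are mine, and `receiver` is assumed to come from an existing `RTCPeerConnection`.

```javascript
// Map a linear audioLevel (0.0 = silence, 1.0 = max) to a bar width in percent.
// Clamps out-of-range values defensively.
function levelToBarWidth(audioLevel) {
  const clamped = Math.min(1, Math.max(0, audioLevel));
  return Math.round(clamped * 100);
}

// Hypothetical polling loop: read synchronization sources off an
// RTCRtpReceiver each animation frame and hand bar widths to a draw callback.
// Works whether or not the receiver's track is attached to a <video>/<audio> sink.
function pollLevels(receiver, draw) {
  function tick() {
    for (const source of receiver.getSynchronizationSources()) {
      // source.source is the SSRC; audioLevel may be undefined for video.
      if (source.audioLevel !== undefined) {
        draw(source.source, levelToBarWidth(source.audioLevel));
      }
    }
    requestAnimationFrame(tick);
  }
  requestAnimationFrame(tick);
}
```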

Importantly: I don't think this change was meant to introduce dependencies on having sinks, nor to impact *getStats* in any way, whose mentions of audio AFAIK predate this change.

-- 
GitHub Notification of comment by jan-ivar
Please view or discuss this issue at https://github.com/w3c/webrtc-pc/issues/2240#issuecomment-517841403 using your GitHub account

Received on Friday, 2 August 2019 20:55:05 UTC