
Re: [webrtc-pc] Clarify the definition of "playout" for `RTCRtpContributingSource`. (#2172)

From: henbos via GitHub <sysbot+gh@w3.org>
Date: Thu, 25 Apr 2019 11:18:07 +0000
To: public-webrtc-logs@w3.org
Message-ID: <issue_comment.created-486631219-1556191085-sysbot+gh@w3.org>
> ...when its audio or video frame is delivered to RTCRtpReceiver's MediaStreamTrack for playout...

Playout does not happen on a track. The track is just the middleman; playout happens some time after passing samples/frames off to its sinks. Isn't that right, @guidou?

> "Playout" is 3. Isn't that clear? This is what Firefox implements. The primary goal of this info is audio bars visually synced with playing video.

Which playout? As far as I can tell, the last step that is within the scope of WebRTC is delivery of samples/frames to the remote MediaStreamTrack. But the track may have 0, 1 or multiple sinks. Each sink could in theory add additional delay. Maybe they cause playout on different speakers, with different software and hardware buffers.

If playout means "the track is handing off samples/frames to its sinks" then I think I understand it. If you are going further than that, then it sounds like you are going out of scope of WebRTC and Media Capture and Streams.

Am I misunderstanding how this works?

When I was advising the author of https://github.com/w3c/webrtc-pc/issues/2177, @drkron, this is what I told him.

Whether or not "playout" is a sensible description, I do think it needs to be clarified.

GitHub Notification of comment by henbos
Please view or discuss this issue at https://github.com/w3c/webrtc-pc/issues/2172#issuecomment-486631219 using your GitHub account
Received on Thursday, 25 April 2019 11:18:08 UTC
