- From: henbos via GitHub <noreply@w3.org>
- Date: Fri, 03 Oct 2025 07:08:26 +0000
- To: public-webrtc-logs@w3.org
> My preference would be to resolve https://github.com/w3c/webrtc-pc/issues/3077 first before merging this.

I intend to present both at the next WG meeting and it sounds like we have agreement there. But to clarify this comment I made:

> I believe that receiver.onssrcchange can NOT be used to shim track.onunmute because onssrcchange fires at decode time while onunmute fires at packet reception time, and in Chrome we don't decode (audio) unless we have a sink

What I mean is that one of the use cases for unmute that has been used in examples is `track.onunmute = () => mediaTag.srcObject = stream /w track;`. For audio, trying to do this instead:

`track.onssrcchange = () => audioTag.srcObject = stream /w track;`

would simply never fire, because you need to set srcObject first in order to unblock decoding, and you need decoding for onssrcchange to fire. But for `track.onunmute` you don't need decoding to be happening first, because it fires as soon as a packet is received. That is also a good thing for the "set srcObject" use case: attaching sinks at packet reception means you have <jitter buffer length> amount of time to set the srcObject without dropping any frames. If you were reacting to "frame decoded" instead, that frame would likely be dropped because you didn't set srcObject in time. So it's an argument for "we need both APIs regardless".

--
GitHub Notification of comment by henbos
Please view or discuss this issue at https://github.com/w3c/webrtc-extensions/pull/243#issuecomment-3364526835 using your GitHub account

--
Sent via github-notify-ml as configured in https://github.com/w3c/github-notify-ml-config
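For the "set srcObject" use case, a rough sketch of the unmute-based pattern (a sketch only: `pc`, `stream`, and `audioTag` are placeholder names, and the ssrcchange event is the one proposed in this PR, not a shipped API):

```js
const audioTag = document.querySelector('audio');

pc.ontrack = ({ track, streams: [stream], receiver }) => {
  if (track.kind !== 'audio') return;

  // onunmute fires at packet reception time, so there is roughly a jitter
  // buffer's worth of time to attach the sink before any frame is dropped.
  track.onunmute = () => {
    audioTag.srcObject = stream;
  };

  // By contrast, reacting to the proposed ssrcchange event here would never
  // run for audio in Chrome: decoding doesn't start until a sink is attached,
  // and ssrcchange fires at decode time.
  // receiver.onssrcchange = () => { audioTag.srcObject = stream; };
};
```

That is the "we need both APIs" point in code form: the commented-out variant only works if something else has already attached a sink.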
Received on Friday, 3 October 2025 07:08:27 UTC