- From: Elad Alon via GitHub <sysbot+gh@w3.org>
- Date: Fri, 27 Jan 2023 10:29:08 +0000
- To: public-webrtc-logs@w3.org
Thank you for migrating the discussion here. Repeating my last comment on the topic:

> Is there a reason, other than implementation complexity[*], for auto-pause to happen on sources rather than tracks?
>
> [*] Normally a valid concern, but here I suspect the complexity is quite low, as pausing could be implemented by dropping frames where applicable. Later, the optimization could be added of not posting frames from the relevant source if all associated tracks are paused. Or wdyt?

I think the comparison to `muted` and `enabled` is a response to this, but I am not sure how it applies. From a Web developer's point of view, the ability to interact with tracks as though they were independent sounds useful. Imagine you wish to process one track but not the other; maybe you're applying background blur to a track that is transmitted remotely, while showing an unmodified local preview. Why would the latter track need to be auto-paused and unpaused?

--
GitHub Notification of comment by eladalon1983
Please view or discuss this issue at https://github.com/w3c/mediacapture-screen-share/issues/255#issuecomment-1406309333 using your GitHub account

--
Sent via github-notify-ml as configured in https://github.com/w3c/github-notify-ml-config
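For illustration, a minimal sketch of the "blur one track, preview the other unmodified" scenario described above, assuming Chromium's Insertable Streams API (`MediaStreamTrackProcessor` / `MediaStreamTrackGenerator`); `blurFrame()`, `previewVideo`, and `peerConnection` are hypothetical placeholders, not part of the original comment:

```js
// Sketch only: capture once, then treat the two tracks independently.
const stream = await navigator.mediaDevices.getDisplayMedia({ video: true });
const [captureTrack] = stream.getVideoTracks();

// Track 1: unmodified local preview (previewVideo is an assumed <video> element).
previewVideo.srcObject = new MediaStream([captureTrack]);

// Track 2: a clone that gets background blur before being sent remotely.
const blurredInput = captureTrack.clone();
const processor = new MediaStreamTrackProcessor({ track: blurredInput });
const generator = new MediaStreamTrackGenerator({ kind: 'video' });

processor.readable
  .pipeThrough(new TransformStream({
    async transform(frame, controller) {
      // blurFrame() is a hypothetical helper returning a new, blurred VideoFrame.
      controller.enqueue(await blurFrame(frame));
      frame.close();
    },
  }))
  .pipeTo(generator.writable);

// The generator is itself a MediaStreamTrack, so it can be transmitted
// (peerConnection is an assumed RTCPeerConnection).
peerConnection.addTrack(generator, new MediaStream([generator]));
```

In this sketch, auto-pausing the transmitted (processed) track would not need to touch the preview track, which is the independence the comment argues for.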
Received on Friday, 27 January 2023 10:29:10 UTC