- From: Florent Castelli via GitHub <sysbot+gh@w3.org>
- Date: Mon, 12 Aug 2024 14:36:58 +0000
- To: public-webrtc-logs@w3.org
For scenario 2, you could have something like:

```
let t = pc.addTransceiver('video');
await negotiate();
// Proposed API: take over the sender's outgoing RTP stream.
let sendStream = await t.sender.replaceSendStream();
while (true) {
  // Packetize frames produced by WebCodecs and send them directly.
  let packet = nextPacketFromWebCodecs();
  sendStream.sendRtp(packet);
}
```

There, the media is signaled and WebCodecs produces the frames, which are packetized and then sent without involving the traditional WebRTC media pipeline. The application layer is free to drop layers based on BWE or other signals (no pure-H264 users? Then don't send H264 to the SFU and upgrade to AV1).

If you wanted to use a custom codec, we have spec work to cover that and create custom ones (Harald's work). We could have something similar for RTP Header Extensions too, and then just about everything should be covered. Or is anything missing?
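To make the layer-dropping idea concrete, here is a minimal sketch building on the proposed `replaceSendStream()` / `sendRtp()` API above. The `currentBandwidthEstimate()` helper, the 500 kbps threshold, and the `temporalLayer` field on the packet are all hypothetical, stand-ins for whatever BWE signal and layer tagging the application actually uses:

```
let t = pc.addTransceiver('video');
await negotiate();
let sendStream = await t.sender.replaceSendStream();

while (true) {
  let packet = nextPacketFromWebCodecs();
  // Hypothetical app-level BWE signal (e.g. derived from transport feedback).
  let estimate = currentBandwidthEstimate();
  // Hypothetical layer tag set by the application's packetizer: skip the
  // highest temporal layer when bandwidth is tight, instead of letting the
  // traditional WebRTC media pipeline adapt.
  if (estimate < 500_000 && packet.temporalLayer === 2) {
    continue;
  }
  sendStream.sendRtp(packet);
}
```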
--
GitHub Notification of comment by Orphis
Please view or discuss this issue at https://github.com/w3c/webrtc-rtptransport/issues/64#issuecomment-2284162644 using your GitHub account

--
Sent via github-notify-ml as configured in https://github.com/w3c/github-notify-ml-config

Received on Monday, 12 August 2024 14:36:59 UTC