[webrtc-nv-use-cases] Add picture timing requirement to Ultra Low latency Broadcast with Fanout use case (#130)

murillo128 has just created a new issue for https://github.com/w3c/webrtc-nv-use-cases:

== Add picture timing requirement to Ultra Low latency Broadcast with Fanout use case ==
In traditional live streaming it is commonplace to send per-frame picture timing information, typically using H.264 picture timing SEI messages (ITU-T Rec. H.264, Annex D.2.3).

This timing information is later used to synchronize out-of-band events such as ad insertion using SCTE 35 splice events (SCTE 35 2023r1, section 6.3). Per-frame picture timing is also widely used in video editing tools.

Implementation/spec-wise, we can use the [abs-capture-time header extension](https://webrtc.googlesource.com/src/+/refs/heads/main/docs/native-code/rtp-hdrext/abs-capture-time), which is exposed via the [RTCRtpContributingSource captureTimestamp WebRTC extension](https://w3c.github.io/webrtc-extensions/#dom-rtcrtpcontributingsource-capturetimestamp).
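For illustration, a minimal sketch of reading that value on the receiving side, assuming a browser that implements the webrtc-extensions fields (the type widening and the polling helper are my own; the fields are not yet in TypeScript's lib.dom):

```ts
// A minimal sketch, not a definitive implementation: captureTimestamp /
// senderCaptureTimeOffset come from webrtc-extensions and are not yet in
// TypeScript's lib.dom, so the dictionary type is widened locally.
interface RTCRtpSynchronizationSourceExt extends RTCRtpSynchronizationSource {
  captureTimestamp?: DOMHighResTimeStamp;      // from abs-capture-time, in ms
  senderCaptureTimeOffset?: DOMHighResTimeStamp;
}

declare const receiver: RTCRtpReceiver; // e.g. from pc.getReceivers()

// Poll the most recent capture timestamp seen on the video SSRC.
// captureTimestamp stays undefined until the abs-capture-time header
// extension has been negotiated and actually received on the wire.
function pollCaptureTimestamp(): void {
  for (const source of receiver.getSynchronizationSources() as
       RTCRtpSynchronizationSourceExt[]) {
    if (source.captureTimestamp !== undefined) {
      console.log(`capture timestamp: ${source.captureTimestamp} ms, ` +
                  `sender clock offset: ${source.senderCaptureTimeOffset ?? 0} ms`);
    }
  }
}

setInterval(pollCaptureTimestamp, 1000);
```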

Retrieving the picture timing information of the rendered frame is quite clumsy at the moment, as it requires interpolating values using the RTP timestamps of the frame, which are the only timing values available in the APIs that expose frames.
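A hedged sketch of what that interpolation looks like today (the estimateCaptureTime helper and its wraparound handling are illustrative, not from any spec), assuming the 90 kHz RTP video clock, the widened dictionary from the sketch above, and lib.dom typings for requestVideoFrameCallback:

```ts
// Map the rtpTimestamp of a rendered frame (from requestVideoFrameCallback
// metadata) back to a capture time using the latest synchronization source
// sample. This is the interpolation described above; exposing the capture
// timestamp directly on the frame would make it unnecessary.
interface RTCRtpSynchronizationSourceExt extends RTCRtpSynchronizationSource {
  captureTimestamp?: DOMHighResTimeStamp;
}

declare const video: HTMLVideoElement;
declare const receiver: RTCRtpReceiver;

function estimateCaptureTime(
  frameRtpTimestamp: number,
  source: RTCRtpSynchronizationSourceExt,
): number | undefined {
  if (source.captureTimestamp === undefined) return undefined;
  // RTP tick difference between the rendered frame and the most recently
  // received packet, accounting for 32-bit timestamp wraparound.
  let delta = frameRtpTimestamp - source.rtpTimestamp;
  if (delta > 0x80000000) delta -= 0x100000000;
  else if (delta < -0x80000000) delta += 0x100000000;
  // Video RTP clocks tick at 90 kHz, i.e. 90 ticks per millisecond.
  return source.captureTimestamp + delta / 90;
}

video.requestVideoFrameCallback(function onFrame(_now, metadata) {
  const [source] = receiver.getSynchronizationSources() as
    RTCRtpSynchronizationSourceExt[];
  if (source && metadata.rtpTimestamp !== undefined) {
    console.log('estimated capture time (ms):',
                estimateCaptureTime(metadata.rtpTimestamp, source));
  }
  video.requestVideoFrameCallback(onFrame);
});
```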

An attempt to add the capture timestamp values is being made in different APIs:
https://github.com/WICG/video-rvfc/issues/86#issuecomment-1978664937

Please view or discuss this issue at https://github.com/w3c/webrtc-nv-use-cases/issues/130 using your GitHub account


-- 
Sent via github-notify-ml as configured in https://github.com/w3c/github-notify-ml-config
