- From: Sergio Garcia Murillo <sergio.garcia.murillo@gmail.com>
- Date: Fri, 23 Nov 2018 23:04:08 +0100
- To: public-webrtc@w3.org
On 23/11/2018 22:40, Harald Alvestrand wrote:
> On 11/23/2018 11:02 AM, Sergio Garcia Murillo wrote:
>> On 23/11/2018 10:52, westhawk wrote:
>>> One problem that QUIC _theoretically_ solves is the congestion
>>> control problem by bringing everything under the same protocol,
>>> but only if you move all media to QUIC and ban RTP.
>> Just for the sake of completeness, the other problem that QUIC solves
>> is the sync between media and data, something that is not achievable
>> today between SCTP and RTP (AFAIK).
>>
>> This is already contemplated in the WebRTC NV use cases, and it is a
>> critical issue for many applications (for example VR ones).
> I've made a couple of stabs in the past at making correlation possible
> between data and media (the easiest seems to be to offer access to the
> RTP timestamp of the frames on both sides of the connection). But those
> efforts have petered out for lack of interest.

I am not sure the RTP timestamp would be enough: even if you can correlate it to the sender's clock source, it is calculated at sending time, not at capture time, so it can be off by a couple of frames (especially at the high frame rates used in VR).

IMHO the solution is to be able to attach a small piece of metadata to the media stream, so that it is attached to the (next) captured frame, copied to the encoded frame, and then transmitted as an RTP header extension in the first RTP packet of the frame. On the receiver side, it would trigger a metadata event on the media stream (or video element) when the frame is about to be displayed.

This metadata could contain sync point info, a timestamp, or even the data itself if it is small enough to fit in a header extension.
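To make the idea concrete, here is a rough sketch of what such an API could look like from JavaScript. Everything in it is hypothetical: `attachMetadata()`, the `framemetadata` event and its `data` field, and the helper names (`currentSyncPointId`, `resyncDataChannel`, `video`) are invented for illustration; no such API exists in the spec today.

```js
// Sender side (hypothetical API): attach a small metadata blob that rides
// along with the next captured frame, is copied to the encoded frame, and
// is sent as an RTP header extension in the first RTP packet of that frame.
const stream = await navigator.mediaDevices.getUserMedia({ video: true });
const [track] = stream.getVideoTracks();

// For example, a sync point id plus a capture-side timestamp, packed into
// a few bytes so it would fit in a one-byte RTP header extension.
const syncPoint = new Uint8Array(8);
const view = new DataView(syncPoint.buffer);
view.setUint32(0, currentSyncPointId);       // app-defined sync point id
view.setUint32(4, performance.now() >>> 0);  // truncated capture timestamp
track.attachMetadata(syncPoint);             // hypothetical call

// Receiver side (hypothetical event): fired when the frame that carried
// the metadata is about to be displayed, so the app can line up data
// channel messages with the exact frame they refer to.
video.addEventListener('framemetadata', (event) => {
  // event.data would be the Uint8Array recovered from the header extension
  resyncDataChannel(event.data);
});
```

Best regards
Sergio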
Received on Friday, 23 November 2018 22:00:57 UTC