
Re: Call for adoption - WEBRTC-QUIC

From: westhawk <thp@westhawk.co.uk>
Date: Sat, 24 Nov 2018 15:48:39 +0100
Message-Id: <717DEC9F-8BD8-41D8-AB97-86E259576B99@westhawk.co.uk>
Cc: public-webrtc@w3.org
To: Sergio Garcia Murillo <sergio.garcia.murillo@gmail.com>
I’m pretty sure you could already do this with the BlobEvent timestamp
 https://w3c.github.io/mediacapture-record/#blobevent-section
from a mediaStreamDestination.

You could tag DataChannel messages with that timestamp and
delay their delivery on the receiver side until that media time has arrived.
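
The receiver-side buffering could look something like this — plain matching logic, no browser APIs, and the class and method names are made up for illustration:

```javascript
// Sketch of delaying DataChannel messages until the matching media time
// has played out on the receiver. The sender tags each message with the
// BlobEvent timestamp of the media it belongs to. All names here are
// invented for illustration; this is not an existing API.
class MediaSyncBuffer {
  constructor() {
    this.pending = []; // { timestamp, message }, kept sorted by timestamp
  }

  // Called for each incoming DataChannel message.
  push(timestamp, message) {
    this.pending.push({ timestamp, message });
    this.pending.sort((a, b) => a.timestamp - b.timestamp);
  }

  // Called as media playback advances (e.g. from a timeupdate handler);
  // returns every message whose media time has now arrived.
  release(mediaTime) {
    const due = [];
    while (this.pending.length && this.pending[0].timestamp <= mediaTime) {
      due.push(this.pending.shift().message);
    }
    return due;
  }
}
```

The messages simply wait in the buffer until playback catches up with the timestamp they were tagged with.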


> On 23 Nov 2018, at 23:04, Sergio Garcia Murillo <sergio.garcia.murillo@gmail.com> wrote:
> On 23/11/2018 22:40, Harald Alvestrand wrote:
>> On 11/23/2018 11:02 AM, Sergio Garcia Murillo wrote:
>>> On 23/11/2018 10:52, westhawk wrote:
>>>> One problem that QUIC _theoretically_ solves is the congestion
>>>> control problem by bringing everything under the same protocol,
>>>> but only if you move all media to QUIC and ban RTP.
>>> Just for the sake of completeness, the other problem that QUIC solves
>>> is the sync between media and data, something that is not achievable
>>> today between SCTP and RTP (AFAIK).
>>> This is already contemplated in the WebRTC NV use cases, and it is a
>>> critical issue for many applications (for example VR ones).
>> I've made a couple of stabs in the past at making correlation possible
>> between data and media (the easiest seems to be to offer access to the
>> RTP timestamp of the frames on both sides of the connection). But those
>> efforts have petered out for lack of interest.
> Not sure if the RTP timestamp would be enough: even if you can correlate it to the sender clock source, it is calculated at sending time, not at capture time, so it can be a couple of frames off (especially considering the high fps used in VR).
> IMHO the solution is to be able to append a small piece of metadata to the media stream so that it is attached to the (next) captured frame, copied to the encoded frame, and then transmitted as an RTP header extension in the first RTP packet of the frame. On the receiver side, it would trigger a metadata event on the media stream (or video element) when the frame is about to be displayed. This metadata could contain sync point info, a timestamp, or even the data itself if it is small enough to fit in a header extension.
> Best regards
> Sergio
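
For what it’s worth, the receiver side of Sergio’s proposal — handing metadata carried in an RTP header extension back out with the frame it was captured with — could be sketched as plain matching logic. This is a speculative sketch of a mechanism that does not exist; every name in it is invented for illustration:

```javascript
// Sketch of matching per-frame metadata to decoded frames on the
// receiver. Metadata arrives keyed by the capture timestamp of the
// frame it was attached to (e.g. parsed from the first RTP packet of
// the frame), and is handed out when that frame is about to be
// displayed. Hypothetical names; not an existing browser API.
class FrameMetadataMatcher {
  constructor() {
    this.byTimestamp = new Map(); // captureTimestamp -> metadata
  }

  // Metadata extracted from the header extension of a frame.
  onMetadata(captureTimestamp, metadata) {
    this.byTimestamp.set(captureTimestamp, metadata);
  }

  // Called when a decoded frame is about to be displayed; returns the
  // metadata captured with that frame (consuming it), or null.
  onFrameDisplayed(captureTimestamp) {
    const metadata = this.byTimestamp.get(captureTimestamp) ?? null;
    this.byTimestamp.delete(captureTimestamp);
    return metadata;
  }
}
```

Because the key is the capture timestamp rather than the send time, the metadata stays glued to the exact frame it was attached to, which is the frame-accuracy Sergio is after.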

Received on Saturday, 24 November 2018 14:49:05 UTC

This archive was generated by hypermail 2.4.0 : Friday, 17 January 2020 19:18:45 UTC