- From: Harald Alvestrand <harald@alvestrand.no>
- Date: Wed, 2 Nov 2016 21:38:04 +0100
- To: Cullen Jennings <fluffy@iii.ca>
- Cc: "public-webrtc@w3.org" <public-webrtc@w3.org>
On 11/02/2016 07:19 PM, Cullen Jennings wrote:
> That's interesting, but for video we might consider a way to access SMPTE timecodes too.

Are SMPTE timecodes carried across RTP for all codecs? I chose to expose the RTP timestamp because it is known to have the same value on both sides.

>
>> On Oct 28, 2016, at 8:19 AM, Harald Alvestrand <harald@alvestrand.no> wrote:
>>
>> I've just submitted the following proposal to the WICG:
>>
>> https://discourse.wicg.io/t/proposal-a-frame-level-event-logging-mechanism-for-webrtc/1780/1
>>
>> The chairs of WEBRTC felt that this was not appropriate work to add to this WG at a time when we're trying to finish up our current document set, but it might be of interest to participants in this group.
>> Copied from the text there:
>>
>> Problem
>> Some projects using WebRTC functionality have indicated that they need to record data on "source to screen" performance - that is, information about how much time passes between some event (typically frame capture) occurring in "real life" on the media-generating side and the serving of the same event to the user on the media-consuming side.
>>
>> Approach
>> I've sketched out an approach in this repo:
>>
>> https://github.com/alvestrand/webrtc-framelog/
>>
>> It consists of a mechanism to tell the media engine to note the times certain events happen to a frame, and a way to get these notes back to JS. It's intended to be predictable and not too resource-intensive.
>>
>> Comments are welcome; guidance on the WICG process as well - this is my first attempt to use this forum for an API proposal.
>
> --
> Surveillance is pervasive. Go Dark.
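To make the idea concrete, here is a minimal sketch of how an application might consume such per-frame event notes, keyed on the RTP timestamp as suggested above. The type and event names below are illustrative assumptions, not the interface proposed in the webrtc-framelog repo; it only shows why a timestamp shared by both sides lets JS compute "source to screen" time.

```typescript
// Hypothetical sketch only: FrameLogEntry and the event names here are
// assumptions for illustration, not the shape of the actual proposal.

type FrameEventType = "capture" | "encode" | "packetize" | "decode" | "render";

interface FrameLogEntry {
  rtpTimestamp: number;   // RTP timestamp of the frame; same value on both sides
  event: FrameEventType;  // which point in the media pipeline was noted
  wallClockMs: number;    // when the media engine noted the event (local clock)
}

// Join sender-side and receiver-side logs on the RTP timestamp and report
// the capture-to-render ("source to screen") time per frame, in ms.
// Assumes the two wall clocks are synchronized; otherwise the result also
// includes the clock offset between the endpoints.
function sourceToScreenDelays(
  senderLog: FrameLogEntry[],
  receiverLog: FrameLogEntry[],
): Map<number, number> {
  const captureTimes = new Map<number, number>();
  for (const e of senderLog) {
    if (e.event === "capture") {
      captureTimes.set(e.rtpTimestamp, e.wallClockMs);
    }
  }

  const delays = new Map<number, number>();
  for (const e of receiverLog) {
    const captured = captureTimes.get(e.rtpTimestamp);
    if (e.event === "render" && captured !== undefined) {
      delays.set(e.rtpTimestamp, e.wallClockMs - captured);
    }
  }
  return delays;
}
```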
Received on Wednesday, 2 November 2016 20:38:41 UTC