- From: Cullen Jennings <fluffy@iii.ca>
- Date: Wed, 2 Nov 2016 12:19:17 -0600
- To: Harald Alvestrand <harald@alvestrand.no>
- Cc: "public-webrtc@w3.org" <public-webrtc@w3.org>
That's interesting, but for video we might also consider a way to access SMPTE timecodes.

> On Oct 28, 2016, at 8:19 AM, Harald Alvestrand <harald@alvestrand.no> wrote:
>
> I've just submitted the following proposal to the WICG:
>
> https://discourse.wicg.io/t/proposal-a-frame-level-event-logging-mechanism-for-webrtc/1780/1
>
> The chairs of WEBRTC felt that this was not appropriate work to add to this WG at a time when we're trying to finish up our current document set, but it might be of interest to participants in this group.
> Copied from the text there:
>
> Problem
> Some projects using WebRTC functionality have indicated that they need to record data on "source to screen" performance - that is, how much time elapses between some event (typically frame capture) occurring in "real life" on the media-generating side and the rendering of the same event to the user on the media-consuming side.
>
> Approach
> I've sketched out an approach in this repo:
>
> https://github.com/alvestrand/webrtc-framelog/
>
> It consists of a mechanism to tell the media engine to note the times at which certain events happen to a frame, and a way to get these notes back to JS. It's intended to be predictable and not too resource-intensive.
>
> Comments are welcome; so is guidance on the WICG process - this is my first attempt to use this forum for an API proposal.
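As a rough illustration of the quoted mechanism - the media engine noting timestamps for events in a frame's lifetime, with JS reading the notes back to compute source-to-screen latency - here is a minimal sketch. All names here (`FrameEventLog`, `note`, `elapsed`, the event labels) are invented for illustration; the actual proposed API is the one in the webrtc-framelog repo above.

```javascript
// Hypothetical per-frame event log, assuming the engine records one
// timestamp per (frame, event) pair and JS can query them afterwards.
class FrameEventLog {
  constructor() {
    this.frames = new Map(); // frameId -> { eventName: timestampMs }
  }

  // Record that `event` happened to frame `frameId` at `timeMs`.
  note(frameId, event, timeMs) {
    if (!this.frames.has(frameId)) this.frames.set(frameId, {});
    this.frames.get(frameId)[event] = timeMs;
  }

  // Elapsed time between two logged events for one frame, or null
  // if either event was never noted (e.g. the frame was dropped).
  elapsed(frameId, fromEvent, toEvent) {
    const f = this.frames.get(frameId);
    if (!f || f[fromEvent] === undefined || f[toEvent] === undefined) {
      return null;
    }
    return f[toEvent] - f[fromEvent];
  }
}

// Example: one frame captured at t=0 ms and rendered at t=120 ms.
const log = new FrameEventLog();
log.note(42, "capture", 0);
log.note(42, "render", 120);
console.log(log.elapsed(42, "capture", "render")); // 120
```

The null case matters for the "predictable and not too resource intensive" goal: dropped frames simply yield no measurement rather than an error.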
Received on Wednesday, 2 November 2016 18:19:46 UTC