[webrtc-stats] Capture vs encode frameWidth, frameHeight and framesPerSecond (#422)

henbos has just created a new issue for https://github.com/w3c/webrtc-stats:

== Capture vs encode frameWidth, frameHeight and framesPerSecond ==
These currently exist in [RTCVideoHandlerStats](https://w3c.github.io/webrtc-stats/#dom-rtcvideohandlerstats). For frameWidth/frameHeight, defined as the "width/height of the last processed frame for this track", it is not clear whether "processed frame" for outbound media refers to frames before encoding (captured frames) or frames after encoding (i.e. frames sent). framesPerSecond, by contrast, is clear: "For sending tracks it is the current captured FPS and for the receiving tracks it is the current decoding framerate."

We should have metrics for both captured and encoded frames. Related issues: [RTCMediaSourceStats](https://github.com/w3c/webrtc-stats/issues/400) and [Move all stream stats from the "sender"/"receiver" stats](https://github.com/w3c/webrtc-stats/issues/402).

**Proposal**
- For frames sent and received, we should have these metrics on a per-RTP stream basis (simulcast compatible).
- For frames captured, we should have these metrics on a "source stats" object.
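To illustrate how an application might consume the proposed split, here is a minimal sketch that separates capture-side from send-side frame metrics in a getStats()-style report. It assumes the proposal lands roughly as described: capture metrics on a source stats object (hypothetically of type "media-source", per issue #400) and encode/send metrics on each "outbound-rtp" entry, one per simulcast layer; the exact stats types and member names are assumptions, not spec text.

```javascript
// Sketch: split capture-side vs send-side video frame metrics from an
// iterable of stats entries (e.g. the values of an RTCStatsReport).
// ASSUMED shapes, per the proposal in this issue:
//   - type "media-source" (capture): width, height, framesPerSecond
//   - type "outbound-rtp" (per RTP stream): frameWidth, frameHeight,
//     framesPerSecond, ssrc
function splitFrameMetrics(statsEntries) {
  const captured = [];
  const sent = [];
  for (const s of statsEntries) {
    if (s.type === 'media-source' && s.kind === 'video') {
      // One entry per track source: frames as captured, pre-encode.
      captured.push({
        width: s.width,
        height: s.height,
        framesPerSecond: s.framesPerSecond,
      });
    } else if (s.type === 'outbound-rtp' && s.kind === 'video') {
      // One entry per RTP stream, so simulcast layers stay distinct.
      sent.push({
        ssrc: s.ssrc,
        width: s.frameWidth,
        height: s.frameHeight,
        framesPerSecond: s.framesPerSecond,
      });
    }
  }
  return { captured, sent };
}
```

With this split, a capture of 1280x720@30 encoded as two simulcast layers would show one `captured` entry and two `sent` entries, making encoder downscaling or frame dropping directly observable.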

Please view or discuss this issue at https://github.com/w3c/webrtc-stats/issues/422 using your GitHub account

Received on Wednesday, 3 April 2019 12:57:43 UTC