- From: jan-ivar via GitHub <sysbot+gh@w3.org>
- Date: Fri, 01 Sep 2017 19:36:49 +0000
- To: public-media-capture-logs@w3.org
> Very few apps should be setting min rates. 24 fps means very close to 1/24 s between frames

If they're very few, I'm inclined to advise them to measure the frame rate for best results.

I think there's also value in constraining UAs from picking a lower *target* frame rate *on purpose*. I fear more apps use that definition. AFAIK no browser returns the actual frameRate in `getSettings()` today.

Nit: `min` is a limiter without gravity, so 1/24 is no guarantee. Maybe `{min: 24, ideal: 24}`. My point is that setting constraints properly is hard. `{exact: 24}` would be a recipe for failure with that definition.

> If the app is OK with data below that rate it would not have set a min rate

I think that's an assumption. It's hard enough to pick a *target* frame rate in gUM. I could easily see the majority of apps meaning: don't give me a *target* frame rate lower than 24.

People may also be intuitively using `exact` as an enforcer, to force rescaling (as in "I don't care about native modes. Just rescale me to this."). That's a model we're [considering](https://bugzilla.mozilla.org/show_bug.cgi?id=1388667#c10) as a compromise, since the alternative leaves few options for navigating the driver mode space.

Again, not sure if my new MBP is broken, but I consistently see rates of ~15-30 fps varying second by second, with both [gum](https://jsfiddle.net/jib1/suLvcr3a/) and [encoded](https://jsfiddle.net/jib1/6j072zcx/), on both FF and Chrome. These fiddles would both fail instantly with that definition.

--
GitHub Notification of comment by jan-ivar
Please view or discuss this issue at https://github.com/w3c/mediacapture-main/issues/466#issuecomment-326666892 using your GitHub account
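For illustration, a minimal sketch of the constraint patterns discussed in the message above, assuming standard `getUserMedia`. The 24 fps figures mirror the thread's example; the error handling is illustrative and is not code from the linked fiddles.

```js
// A sketch of the {min, ideal} pattern suggested above.
async function openCamera() {
  try {
    // `min` alone is a limiter without gravity; pairing it with `ideal`
    // also pulls the UA's *target* rate toward 24 fps.
    return await navigator.mediaDevices.getUserMedia({
      video: {frameRate: {min: 24, ideal: 24}}
    });
  } catch (e) {
    if (e.name == "OverconstrainedError") {
      // A hard floor (or {exact: 24}) fails like this when the source
      // can't commit to it. Under a "delivered rate" definition, even
      // transient dips below 24 fps would be grounds for failure.
      console.log("Cannot satisfy constraint: " + e.constraint);
    }
    throw e;
  }
}

openCamera().then(stream => {
  const [track] = stream.getVideoTracks();
  // Note: getSettings().frameRate reports the configured rate, not a
  // measurement of what the camera actually delivers.
  console.log("Configured frameRate: " + track.getSettings().frameRate);
});
```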
Received on Friday, 1 September 2017 19:36:47 UTC