- From: James Simonsen <simonjam@chromium.org>
- Date: Mon, 22 Aug 2011 16:07:32 -0700
- To: public-web-perf <public-web-perf@w3.org>
- Message-ID: <CAPVJQimCCQ4ywbiFQx4cDw9uFCjRc0Y6jtGF-cV5Q84zunDGhA@mail.gmail.com>
On Mon, Aug 22, 2011 at 3:54 PM, Zhiheng Wang <zhihengw@google.com> wrote:

>> However, looking longer term, there's a need for more precision. One
>> example is graphics, where milliseconds are already insufficient for
>> measuring frame rate.
>
> Do you have a more specific example?

If you measure frame rate by timing the interval between two consecutive
frames at millisecond resolution, you will get 58 or 62, but never 60.

>> Down the road, as games and apps get more sophisticated, we can expect
>> people to want to time things within a frame.
>
> IIRC, 50 msec is the threshold for humans to detect any latency at all in
> FPS games. An app can still measure some other ops inside it. But overall,
> I am still not sure why an application really cares to know the exact
> sub-millisecond delay.

In order to display at 60 fps, an application must finish all of its work
within 16.7 ms. If we're talking about something as complicated as a game,
there's a lot of work to be done between frames: physics, input, network,
sound, AI, graphics, etc. I expect developers will need higher resolution
than integer milliseconds for these. One millisecond out of 16.7 is not much
resolution.

James
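[Editor's note: a minimal sketch, not part of the original thread, of the arithmetic behind the "58 or 62, but never 60" point. It assumes an ideal display running at exactly 60 Hz and an integer-millisecond clock such as Date.now(); the variable names are illustrative only.]

```typescript
// Why an integer-millisecond clock cannot report 60 fps exactly.
const TRUE_FRAME_INTERVAL_MS = 1000 / 60; // ≈ 16.667 ms between frames at 60 Hz

// With integer-millisecond timestamps, the measured gap between two
// consecutive frames can only come out as 16 or 17 ms, depending on where
// the frame boundaries fall relative to the clock ticks.
const fpsFrom16ms = 1000 / 16; // 62.5  -> reported as ~62 fps
const fpsFrom17ms = 1000 / 17; // ≈ 58.8 -> reported as ~58 fps

// With sub-millisecond timestamps, the true rate is recoverable.
const fpsFromTrueInterval = 1000 / TRUE_FRAME_INTERVAL_MS; // 60 fps

console.log(
  Math.floor(fpsFrom16ms),        // 62
  Math.floor(fpsFrom17ms),        // 58
  fpsFromTrueInterval             // 60
);
```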
Received on Monday, 22 August 2011 23:07:57 UTC