- From: Boris Zbarsky <bzbarsky@MIT.EDU>
- Date: Thu, 11 Oct 2012 00:19:29 -0400
- To: Ilya Grigorik <igrigorik@google.com>
- CC: public-web-perf@w3.org
On 10/10/12 11:42 PM, Ilya Grigorik wrote:
> Interesting, thanks! I think the intuitive definition for 99.9% of
> people who do not live inside the graphics stack is the last one that
> you suggested, which is the "user sees something, anything...". Granted,
> I think you're hinting at the fact that even that is not 100% accurate
> due to hardware and other factors that are outside of our control

Keeping in mind that I do _not_ live inside the graphics stack, and so may be totally wrong about this, I suspect it's plausible to report the time at which you first told your graphics hardware to go ahead and put something on screen. How much time passes between then and things actually appearing on screen is hard to determine, I suspect. See http://superuser.com/a/419167 for example. :(

But again, this brings me back to the real question: why do people want this value? Whether it's OK to report misleading (due to hardware latency) numbers, and to what extent they can be misleading, really depends on the use cases people plan to put this to. Is the idea that changing a web page in a way that reduces this value is something to strive for? Is it just a feel-good kind of thing so we can say we're providing it? Something else?

-Boris
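
[Editor's note: to make the question concrete, here is a minimal sketch of how a page might consume such a "first paint" timestamp and ship it to an analytics endpoint. The 'paint' entry type, the 'first-paint' entry name, and the /metrics URL are assumptions modeled on the Paint Timing API that browsers shipped later; nothing in this thread defines such an interface.]

    // Hypothetical sketch: watch for a buffered "first paint" entry and
    // beacon its timestamp for later analysis. Note that startTime marks
    // when the browser handed the frame to the graphics stack, not when
    // pixels actually became visible on the display.
    const observer = new PerformanceObserver((list) => {
      for (const entry of list.getEntries()) {
        if (entry.name === 'first-paint') {
          navigator.sendBeacon(
            '/metrics',
            JSON.stringify({ firstPaint: entry.startTime })
          );
        }
      }
    });
    observer.observe({ type: 'paint', buffered: true });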
Received on Thursday, 11 October 2012 04:19:58 UTC