RE: [HighResolutionTime] accuracy

Jannis,

The working group discussed your feedback in one of our recent weekly conference calls. We felt it was not a good idea to surface the complexities of different systems to the web platform by introducing an attribute describing the system's clock resolution. We also agree that, due to hardware or software constraints, a UA may not be able to provide sub-millisecond resolution in all situations.

We have decided to recommend that UAs define DOMHighResTimeStamp as milliseconds accurate to a thousandth of a millisecond (i.e., a microsecond). If a system cannot support sub-millisecond resolution, it should fall back to millisecond units accurate only to the millisecond. We hope this definition will encourage developers to rely on microsecond resolution, knowing that the fallback will be no worse than what Date.now() provides.
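To make the fallback behavior concrete, here is a sketch of how a developer might probe the resolution actually delivered by the monotonic clock, rather than assuming microsecond accuracy is available. The function name is illustrative, not part of any spec; it simply spins until the clock ticks and records the smallest observable step.

```javascript
// Estimate the resolution delivered by the monotonic clock by
// measuring the smallest non-zero tick observable from
// performance.now(). On a microsecond-resolution clock this will be
// around 0.001 ms; on a millisecond fallback it will be around 1 ms.
function estimateClockResolution(samples = 1000) {
  let smallest = Infinity;
  for (let i = 0; i < samples; i++) {
    const t0 = performance.now();
    let t1 = performance.now();
    while (t1 === t0) {
      t1 = performance.now(); // spin until the clock advances
    }
    const delta = t1 - t0;
    if (delta < smallest) smallest = delta;
  }
  return smallest; // in milliseconds
}
```

A script can compare the returned value against its own needs and, for example, decline to attempt sub-millisecond measurements when only the millisecond fallback is in effect.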

Thanks,
Jatinder

-----Original Message-----
From: Jannis Froese [mailto:froese@lionservers.de] 
Sent: Thursday, March 22, 2012 4:36 PM
To: public-web-perf@w3.org
Subject: [HighResolutionTime] accuracy

 I am a JavaScript developer and always like to follow the latest advancements.

 The monotonic clock introduced by High Resolution Time is a huge step forward and will eliminate many problems.

 The higher accuracy, however, is in many situations worthless if you don't know how accurate it really is. You can't define a reaction based on a delay of, for example, more than a twentieth of a millisecond if it isn't guaranteed that the clock is that accurate. Thus, High Resolution Time is in many situations only usable up to the guaranteed accuracy of a tenth of a millisecond.

 Therefore, I would propose including an additional value stating the timer accuracy the client can deliver. This would add flexibility and remove unnecessary restrictions, particularly as the delivered accuracy improves along with the hardware.


 Regards

 Jannis Froese

Received on Wednesday, 11 April 2012 16:33:34 UTC