- From: Rick Byers via GitHub <sysbot+gh@w3.org>
- Date: Tue, 28 May 2019 14:34:05 +0000
- To: public-pointer-events@w3.org
I agree the spec doesn't require this, so relaxing the tests seems fine. That said, a couple of years ago I believe we did [get agreement](https://github.com/whatwg/dom/issues/23#issuecomment-282708277) that `Event.timeStamp` was the right way to [measure input latency](https://developers.google.com/web/updates/2016/01/high-res-timestamps). So it would seem confusing to me for a `pointerdown` to have a different start time from its equivalent `mousedown`; it would suggest that one or the other doesn't really reflect the real input latency.

Isn't there a risk that this could break sites' own [measurements of their input latency](https://developers.google.com/web/updates/2018/05/first-input-delay)? It looks like the [FID polyfill](https://github.com/GoogleChromeLabs/first-input-delay/blob/master/src/first-input-delay.js) just pays attention to the first event it sees and listens for both `pointerdown` and `mousedown`, so as long as WebKit uses the real original timeStamp for `pointerdown` it should be OK. But other sites may not be.
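To illustrate the measurement pattern at stake, here is a minimal sketch (not the polyfill's actual code) of how a page might compute first-input delay from `Event.timeStamp`, listening for both `pointerdown` and `mousedown` and taking whichever fires first. If `pointerdown` carried a later, synthesized timeStamp, the two event types would report different latencies for the same physical input:

```js
// Minimal first-input-delay sketch: record only the first input event seen.
let firstInputSeen = false;

function onFirstInput(event) {
  if (firstInputSeen) return;
  firstInputSeen = true;
  // event.timeStamp is a DOMHighResTimeStamp relative to the page's time
  // origin, so it is directly comparable with performance.now().
  const delay = performance.now() - event.timeStamp;
  console.log(`First input (${event.type}) delay: ${delay.toFixed(1)}ms`);
}

for (const type of ['pointerdown', 'mousedown']) {
  addEventListener(type, onFirstInput, { capture: true, passive: true });
}
```

-- 
GitHub Notification of comment by RByers
Please view or discuss this issue at https://github.com/w3c/pointerevents/issues/284#issuecomment-496543262 using your GitHub account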
Received on Tuesday, 28 May 2019 14:34:07 UTC