Re: [touch-events] Why don't Touch objects have offsetX and offsetY properties like MouseEvent does?

Well, in case it gives a bit more background about why I really want 
this particular feature:

I have a matrix3d-transformed element depicting the screen of a handheld device, with perspective, and I want some interaction on it: the user should be able to click/tap/swipe the screen and have it react as you'd expect. To determine where on the display the user has pointed, I need to get from screen/page coordinates back to the local coordinates of the transformed element. That's exactly what `offsetX` and `offsetY` give me. But taking `clientX`/`clientY` or `pageX`/`pageY` (which are all I currently have if I'm working from touch events) and converting them back to local coordinates is hard, at least as far as I can tell. I managed it: my solution involves taking the inverse of the transformation matrix and multiplying it with the screen/page coordinates. But what a mess, and I certainly wouldn't expect front-end developers without a programming or mathematics background to figure out how to do this.
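To make that concrete, here's a rough sketch of the kind of inverse mapping I mean (illustrative only, not my exact code): `screenEl` and `wrapperEl` are placeholder names, where `wrapperEl` is an untransformed parent assumed to occupy the same layout box as the transformed element, and it assumes the perspective is baked into the element's own computed transform rather than coming from an ancestor's `perspective` property.

```js
function clientToLocal(clientX, clientY, screenEl, wrapperEl) {
  const style = getComputedStyle(screenEl);
  const [ox, oy] = style.transformOrigin.split(' ').map(parseFloat);
  const rect = wrapperEl.getBoundingClientRect();

  // Forward mapping from local (untransformed) coordinates to client
  // coordinates: position the element, then apply its transform about the
  // transform-origin.
  const forward = new DOMMatrix()
    .translate(rect.left + ox, rect.top + oy)
    .multiply(new DOMMatrix(style.transform))
    .translate(-ox, -oy);
  const inverse = forward.inverse();

  // transformPoint returns homogeneous coordinates, so divide by w
  // (w stays 1 when no perspective is involved).
  const unproject = (z) => {
    const p = inverse.transformPoint(new DOMPoint(clientX, clientY, z, 1));
    return { x: p.x / p.w, y: p.y / p.w, z: p.z / p.w };
  };

  // With perspective, inverting a single point at z = 0 doesn't necessarily
  // land on the element's plane, so un-project a ray through two depths and
  // intersect it with the local z = 0 plane.
  const p0 = unproject(0);
  const p1 = unproject(1);
  const t = p0.z / (p0.z - p1.z);
  return {
    x: p0.x + t * (p1.x - p0.x),
    y: p0.y + t * (p1.y - p0.y),
  };
}
```

Calling `clientToLocal(touch.clientX, touch.clientY, screenEl, wrapperEl)` would then give the point in the element's untransformed coordinate space, which is exactly what I'd hope `offsetX`/`offsetY` could hand me directly.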

My solution is in fact a little buggy (there's some error in the calculated position, and interestingly it looks like Firefox's implementation of `offset*` may have the same bug as whatever is in my code, while Chrome's does not), and I'd absolutely love to know what I've done wrong. So in case anyone's interested, I have [an open Stack Overflow question here](http://stackoverflow.com/q/36373114/496046) (including a [jsfiddle of the problem](https://jsfiddle.net/gq1vLaxk/5/)), and I also [reported what I think is a bug to Firefox here](https://bugzilla.mozilla.org/show_bug.cgi?id=1261645).

-- 
GitHub Notification of comment by tremby
Please view or discuss this issue at 
https://github.com/w3c/touch-events/issues/62#issuecomment-205218599 
using your GitHub account

Received on Monday, 4 April 2016 09:50:37 UTC