- From: Benoit Marchant <marchant@mac.com>
- Date: Thu, 04 Apr 2013 00:23:31 -0700
- To: public-pointer-events@w3.org
- Cc: Doug Schepers <schepers@w3.org>, "montagejs@googlegroups.com" <montagejs@googlegroups.com>
Hi,

My name is Benoit Marchant. I lead development of the Montage framework, which has been focusing on touch. After some conversations about concerns we have, Doug suggested sharing them directly on the list, so here goes!

We understand that multi-touch will produce multiple pointer events, one for each moving finger in a single finger-position detection cycle. But we believe something is missing from the draft for detecting multi-touch gestures: a way to know which events belong to a single position detection cycle. For example, if you move two fingers at the same time, you would like to know that both moved at the same time, not one after the other. You also want to know which event is the last one within that cycle. A timestamp is not enough, because while you are receiving the events you cannot tell which one is the last in the cycle.

One proposal to address this would be for the primary pointer within a group (touch, for example) to be fired last within its finger detection cycle. That way one could store pointer events and then recognise gestures.

Another way, which looks like it was initially proposed by Microsoft, would be to have access to the pointer list (msPointerList). We may be missing something, but why was that removed, leaving that gap?

Also, is there a reason why 3D pointers like the Kinect or Leap Motion are not included? And how would a mouse wheel with left/right tilt plus rotation be represented in this spec?

Looking forward to hearing your feedback on these questions and suggestions.

Thanks!
Benoit
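To illustrate, the per-cycle grouping under the primary-fires-last proposal could be sketched as follows. This is a hypothetical sketch, not spec behaviour: makeCycleBuffer is an invented helper, and the plain objects stand in for PointerEvents, assuming the proposal that the primary pointer's event is delivered last within its detection cycle.

```javascript
// Hypothetical sketch: buffer pointer events and flush the batch when the
// primary pointer's event arrives, which under the proposal above marks
// the end of one finger-position detection cycle.
function makeCycleBuffer(onCycle) {
  let batch = [];
  return function handlePointerMove(ev) {
    batch.push(ev);
    if (ev.isPrimary) {   // proposal: primary fires last in its cycle
      onCycle(batch);     // batch is now a complete multi-touch snapshot
      batch = [];
    }
  };
}

// Usage: two fingers moving within one detection cycle.
const cycles = [];
const handle = makeCycleBuffer(batch =>
  cycles.push(batch.map(e => e.pointerId)));
handle({ pointerId: 2, isPrimary: false, x: 10, y: 10 });
handle({ pointerId: 1, isPrimary: true,  x: 20, y: 20 }); // primary → cycle complete
```

With such a guarantee, a gesture recogniser could operate on complete per-cycle snapshots rather than guessing, from timestamps alone, whether more events from the same cycle are still coming.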
Received on Thursday, 4 April 2013 07:25:07 UTC