Re: Feedback on pointer event spec

On Sun, Apr 7, 2013 at 2:02 PM, Jacob Rossi <Jacob.Rossi@microsoft.com> wrote:

> On Thu, Apr 4, 2013 at 12:23 AM, Benoit Marchant <marchant@mac.com> wrote:
> >
> > Hi
> >
> > My name is Benoit Marchant, and I lead the Montage framework development,
> > which has been focusing on touch. Through conversations about some
> > concerns we have, Doug suggested sharing them directly on the list, so
> > here it goes!
>
> Always glad to have framework developers participate and provide feedback.
> Welcome!
>

+1 - feedback from frameworks like Montage is incredibly valuable, thanks
for participating!


> > We understand that multi-touch will produce multiple pointer events, one
> > for each moving finger in a single finger-position detection cycle. But
> > we believe there is something missing in the draft for detecting
> > multi-touch gestures: a way to know which events belong to a single
> > position detection cycle. For example, if you move two fingers at the
> > same time, you would like to know that both moved at the same time, and
> > not one after the other. You also want to know which event is the last
> > one within that cycle, so a timestamp is not enough: while you are
> > receiving the events, you cannot tell which one is the last in the cycle.
>

Just to make sure I understand: you want to know "which one is the last one
in the cycle" so you know when to start processing the gesture, right?
There are other complications here. E.g. in Chrome, at least, we coalesce
movement events when there is contention on the main thread (there's little
point dispatching a move event when you know the pointer has already moved
elsewhere). So, depending on timing, we might coalesce an event for one
finger in a pair but not the other (on Linux/ChromeOS, where I'm most
familiar, the XInput events come to us in a stream per touch, so we've
already lost any correlation between different touches that the hardware
had). I suspect this should be visible today with TouchEvents: if you drag
two fingers across the screen, you'll probably occasionally get a touchmove
event for only a single finger. Do you expect this to be a problem for code
like what you have in Montage? Isn't this something gesture handling
already needs to be prepared for? E.g. in a slow movement, one finger may
not appear to move at all.

Perhaps a better model is to keep track of the active pointers as you get
events (manually today, but we expect to absorb this into the browser in
v2), and then rely on an animation loop with requestAnimationFrame (or a
timer if a different frequency is desired) to periodically process the
current positions for the sake of gesture recognition?
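
Something roughly like this is what I have in mind - just a sketch, using
the unprefixed event names from the spec (substitute the MSPointer* names
for IE10 today), with recognizeGestures() standing in for whatever gesture
logic Montage actually has:

function recognizeGestures(pointers) {
        // Placeholder: whatever gesture logic the framework wants to run
        // against the current set of pointer positions.
}

var activePointers = {};  // pointerId -> latest event seen for that pointer

function track(ev)   { activePointers[ev.pointerId] = ev; }
function untrack(ev) { delete activePointers[ev.pointerId]; }

document.addEventListener('pointerdown', track);
document.addEventListener('pointermove', track);
document.addEventListener('pointerup', untrack);
document.addEventListener('pointercancel', untrack);

function frame() {
        // Process the current snapshot once per frame, regardless of how
        // the individual move events were delivered or coalesced.
        recognizeGestures(activePointers);
        requestAnimationFrame(frame);
}
requestAnimationFrame(frame);

The per-event handlers just record state, and the frame callback decides
when there's enough to recognize a gesture, which sidesteps the "which
event is last in the cycle" question entirely.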

A concrete example might help me understand the importance of this.  For
what sort of gesture would the distinction be significant?

>
> > One proposition to address that would be for the primary pointer within
> > a group (touch, for example), to be fired last within its finger detection
> > cycle. That way one could store pointer events and then recognise gestures.
>
> I believe "position detection cycle" here refers to device frames, where
> each frame is a set of updates to all known pointers.  For what it's worth,
> this doesn't apply to all devices. Some devices actually report pointer
> data in a serial manner (rather than updates to all of them at once). This
> largely doesn't affect your scenario, as implementations can abstract away
> this underlying detail if needed. Just thought I'd mention it as it is a
> complexity implementers may need to consider in order to provide the
> illusion that 2 simultaneous moves could be reported as actually happening
> at the same time (e.g. in the same event).


> > Another way, which looks like it was initially proposed by Microsoft,
> > would be to have access to the pointer list (msPointerList). We may be
> > missing something, but why was that removed, leaving that gap?
>
> Some sort of pointer list API is on our list of V2 features [1]. There are
> multiple reasons it was removed from IE10:
>
> 1. We need to think more about how such an API works when you consider
> iframes (e.g. pointers should probably have an associated document and not
> be exposed to other documents through such list APIs)
> 2. In the IE10 Consumer Preview implementation, new pointers could
> actually show up in the list before you had received a pointer event for
> them. This often led to compat issues or unnecessary complexity for some
> scenarios.
> 3. Because multi-touch produces pointer events for each moving finger, a
> pointer list on the event itself encourages bad performance practices.
>  Specifically, even with just the Consumer Preview of IE10, we started to
> see a lot of:
>
> function handlePointerMove(ev) {
>         var pointers = ev.getPointerList();
>         for(var i=0; i<pointers.length; i++) {
>                 doSomethingPerPointer(pointers[i]);
>         }
> }
>
> So the code is getting a pointer event per pointer, and then for each
> event it is iterating over each pointer. This is a (perhaps not so obvious)
> N^2 algorithm. With devices that can report pointers at 200Hz, that's
> really bad.
>
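
For what it's worth, the per-pointer case doesn't really need the list at
all: the event already identifies the pointer that moved, so (assuming
doSomethingPerPointer can work from the event itself) the linear version of
the handler above is simply:

function handlePointerMove(ev) {
        // The event is already scoped to the single pointer that moved,
        // so there's no need to walk the whole list on every event.
        doSomethingPerPointer(ev);
}
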
> That said, there are good scenarios where you aren't doing something per
> pointer and are instead comparing the pointers in the current set against
> each other (gestures). Additionally, game loops can benefit from having a
> list of pointers rather than handling events for each. To address these
> issues, I personally think some sort of global pointer list API would be
> great (e.g. document.pointerList). We'll tackle this problem in V2. In V1
> implementations, you can easily craft an array of pointers yourself [2].
>
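
To make that last point concrete for anyone following along, the
hand-rolled bookkeeping is just a few listeners plus a snapshot helper -
this is only a sketch, and the names below are mine rather than anything
from the spec or from [2]:

var pointers = {};  // pointerId -> most recent event seen for that pointer

['pointerdown', 'pointermove'].forEach(function (type) {
        document.addEventListener(type, function (ev) {
                pointers[ev.pointerId] = ev;
        });
});
['pointerup', 'pointercancel'].forEach(function (type) {
        document.addEventListener(type, function (ev) {
                delete pointers[ev.pointerId];
        });
});

// Rough stand-in for a global pointer list: returns a snapshot array.
function currentPointerList() {
        return Object.keys(pointers).map(function (id) {
                return pointers[id];
        });
}
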
> > Also, with 3D pointers like Kinect or LeapMotion, is there
> > a reason why they're not included?
>
> There's a good thread on this that explains the reasons [3]. To summarize,
> * Even more so than the pointer list, 3D pointers need a bit more API
> exploration for use in web scenarios.
> * We want to prioritize getting the already common devices supported (e.g.
> touch).
> * We designed the V1 spec such that it is easily extended to support other
> devices. So this should enable exploration in implementations even before
> V2.
>
> > How would a mouse wheel with left/right tilt +
> > rotation be represented in this spec?
>
>  These are still covered by the wheel event described in DOM Level 3
> Events (e.g. deltaX, deltaY) [4].
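
Right - and concretely, a handler like the one below would typically see
the left/right tilt as a horizontal delta and the rotation as a vertical
one (the exact mapping is up to the platform, and the handler here is just
illustrative):

document.addEventListener('wheel', function (ev) {
        // ev.deltaX: horizontal scroll, e.g. left/right wheel tilt
        // ev.deltaY: vertical scroll, i.e. ordinary wheel rotation
        // ev.deltaMode: the unit of the deltas (pixel, line or page)
        console.log(ev.deltaX, ev.deltaY, ev.deltaMode);
});
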
>
> -Jacob
>
> [1]
> http://www.w3.org/wiki/PointerEvents/UseCasesAndRequirements#Requirements:_Pointer_Events_v.Next_Specification
> [2]
> http://lists.w3.org/Archives/Public/public-pointer-events/2013JanMar/0205.html
> [3]
> http://lists.w3.org/Archives/Public/public-pointer-events/2013JanMar/0228.html
> [4] http://www.w3.org/TR/DOM-Level-3-Events/#events-WheelEvent
>

Received on Tuesday, 9 April 2013 16:38:37 UTC