Re: Touch and gestures events

Hello List,
I subscribed just now, hope this will be in the correct thread...

> We have implemented touch and manipulate (including pan, scale and
> rotate) events on top of Qt 4.6 and WebKit, and the open-sourced
> results can be seen in http://opensource.nokia.com/Starlight

The manipulate API is not enough to do really interesting things with
multi-touch; for example, an application may want to detect specific
gestures (such as a three-finger 3D rotate/move).
With the current TouchEvent this would be quite hard to achieve: for
easy gesture detection, developers need three things:

a "history" function of the event: multi-touch gestures involve a lot of
dragging, so it is useful to know where the path of a given finger
started and how it arrived at its current point. Each TouchEvent should
include a pointer to the previous TouchEvent for the same finger (if
any), effectively forming a linked list whose tail node is the "touch
down" event for that finger, followed by the subsequent "touch move"
events up to the "touch up" event.
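To illustrate the idea, here is a minimal sketch of such per-finger
chaining with plain objects standing in for TouchEvents; the names
(linkSample, prev, fullPath) are illustrative and not part of any spec:

```javascript
// Each touch sample keeps a pointer to the previous sample for the
// same finger, forming a linked list back to the touch-down event.
function linkSample(prev, x, y, type) {
  // prev is null for a "touch down" sample (the tail of the list)
  return { x, y, type, prev };
}

// Walk the chain back to the touch-down sample to recover the full path.
function fullPath(sample) {
  const path = [];
  for (let s = sample; s !== null; s = s.prev) {
    path.unshift({ x: s.x, y: s.y });
  }
  return path;
}

// Example: down at (0,0), two moves, then up at (3,3).
const down = linkSample(null, 0, 0, "down");
const move1 = linkSample(down, 1, 1, "move");
const move2 = linkSample(move1, 2, 2, "move");
const up = linkSample(move2, 3, 3, "up");
console.log(fullPath(up).length); // 4 samples, from down to up
```

Given only the final "touch up" event, the whole trace of that finger
can be reconstructed by walking the chain.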

a list of current event chains: the application could of course catch
the touch events itself, adding each chain to a list on "touch down" and
removing it on "touch up", but that would be quite labor-intensive,
since, for optimal comfort, one such "current touches" list should exist
for every element that can receive the touch events (i.e. when hit).
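This is the bookkeeping an application would otherwise have to repeat
for every element. A minimal sketch, keyed by a finger identifier in the
spirit of Touch.identifier (everything else here is made up):

```javascript
// "Current touches" list for one element: a map from finger id to the
// chain of positions recorded so far for that finger.
const activeChains = new Map();

function onTouchDown(id, x, y) {
  activeChains.set(id, [{ x, y }]); // start a new chain
}

function onTouchMove(id, x, y) {
  const chain = activeChains.get(id);
  if (chain) chain.push({ x, y }); // extend the finger's chain
}

function onTouchUp(id) {
  activeChains.delete(id); // chain ends when the finger lifts
}

// Two fingers down, one moves, one lifts:
onTouchDown(1, 10, 10);
onTouchDown(2, 50, 50);
onTouchMove(1, 12, 14);
onTouchUp(2);
console.log(activeChains.size);          // 1 finger still down
console.log(activeChains.get(1).length); // 2 samples recorded
```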

some kind of "tagging": applications would watch for touch events and,
upon receiving them, analyze the "current touches" list to detect
multi-finger gestures, like "pinch". For example, when exactly two
fingers are pressed down on an object, that is a "pinch". Should their
traces accidentally leave the object, they should not be recognized as
some other gesture; the application would therefore "tag" them so they
no longer show up in "current touches" lists higher or lower in the
hierarchy. Like bubble/propagation prevention, but for the whole chain
of touch events that describes the motion of the "tagged" finger.
(This could be a library function, though.)
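The pinch example above could look roughly like this; the `tagged` flag
is an assumption of this sketch, no such field exists in any TouchEvent
draft:

```javascript
// Detect a pinch on an element: exactly two untagged chains active.
// Tagging both chains claims them for this gesture, so detectors on
// other elements in the hierarchy will skip them.
function detectPinch(chains) {
  const free = chains.filter(c => !c.tagged);
  if (free.length === 2) {
    free.forEach(c => { c.tagged = "pinch"; });
    return free; // the two chains now belong to this gesture
  }
  return null;
}

// Three fingers down, one already claimed by a "drag" elsewhere:
const chains = [{ id: 1 }, { id: 2 }, { id: 3, tagged: "drag" }];
const pinch = detectPinch(chains);
console.log(pinch !== null);   // true: exactly two untagged chains
console.log(chains[0].tagged); // "pinch"
```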


The reason event chaining should live in the DOM, and not in the
application, is consistency: there are different approaches to tracking.
One might simply look for the trace whose endpoint is nearest to the
motion event to be matched, or also consider the velocity and curvature
of the trace to resolve conflicts. (For example, many multi-touch
frameworks fail to correctly track two fingers describing an X shape
with > and < motions, and end up with / and \ traces.)
But maybe that's the OS's or driver's job.
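For concreteness, here is the naive nearest-endpoint matcher, the
tracking strategy most prone to the X-shape failure above, since it
ignores velocity and curvature entirely (all names are illustrative):

```javascript
// Assign a new motion point to whichever trace currently ends closest
// to it. Near a crossing, both endpoints are almost equidistant, so
// this can attach the point to the wrong finger's trace.
function matchToTrace(traces, point) {
  let best = null, bestDist = Infinity;
  for (const trace of traces) {
    const end = trace[trace.length - 1];
    const d = Math.hypot(end.x - point.x, end.y - point.y);
    if (d < bestDist) { bestDist = d; best = trace; }
  }
  return best;
}

// Two fingers about to cross: their endpoints nearly coincide.
const traceA = [{ x: 0, y: 0 },  { x: 5, y: 5 }];
const traceB = [{ x: 10, y: 0 }, { x: 5.1, y: 4.9 }];
const next = matchToTrace([traceA, traceB], { x: 5, y: 6 });
console.log(next === traceA); // true, but only because A ended marginally closer
```

A tracker that also extrapolated each trace's velocity would keep the
two fingers apart through the crossing instead of relying on a
hair's-breadth distance comparison.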



Sorry if my rambling was irrelevant; my last experience with multi-touch
dates from 2007.

-- 
pascal germroth

Received on Monday, 2 November 2009 01:51:13 UTC