Re: Touch and gestures events

Hi, Kari-

That's very cool stuff.  Thanks for letting us know about that... I hope 
that if we do indeed standardize multitouch and gesture events, you will 
lend us your practical experience in specifying them.

It's great that you've opened the source on the multitouch stuff... I 
don't suppose you have Mac binaries?  What sort of hardware is needed to 
support it?  Are there any restrictions on the OS level, or can you 
enable multitouch on any OS?

Regards-
-Doug Schepers
W3C Team Contact, SVG and WebApps WGs


kari.hiitola@nokia.com wrote (on 10/15/09 10:33 AM):
>
> Hi,
>
> I suppose that the interest Olli mentioned was ours (Nokia).
> Unfortunately there was a long delay before we could join the
> discussion and release our implementation, but yes, we have
> previously discussed touch events with Olli. We would be interested in
> participating in the standardization of touch and gesture events.
>
> We have implemented touch and manipulate events (including pan, scale,
> and rotate) on top of Qt 4.6 and WebKit; the open-sourced results
> can be seen at http://opensource.nokia.com/Starlight . Working binaries
> for Windows 7 (on the Dell Latitude XT/XT2 and some HP TouchSmarts), source,
> videos, and some documentation can be found by following the links there.
> To bridge the period while there is no standard, we also funded an
> event abstraction library for Script.aculo.us. Its release post is here:
> http://mir.aculo.us/2009/10/15/pinch-twist-zoom-bringing-multitouch-to-a-browser-near-you/
>
>
> Our approach was a bit different from Olli’s, but I see a
> lot of merit in his proposal, especially in extracting the common
> parameters into PointerEvent. Naturally, we also have some suggestions
> for improvement:
>
> In TouchEvent, there should be separate height and width attributes
> instead of a single size, because the touch area is not necessarily a
> circle or a square. That is of course still an approximation, but in
> practice it is usually accurate enough. Providing a bitmap of the touch
> area would be a natural next step, but I don’t think it is needed for
> now, and future-proofing usually fails miserably.
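To make the width/height idea concrete, here is a rough sketch (not taken from the Starlight source; the names TouchPoint and contactArea are illustrative) of a touch point that reports separate width and height, approximating the contact patch as an ellipse:

```typescript
// Hypothetical touch point with separate width/height rather than one "size".
interface TouchPoint {
  identifier: number;
  pageX: number;
  pageY: number;
  width: number;   // extent of the contact area along the x axis, in px
  height: number;  // extent of the contact area along the y axis, in px
}

// Approximate area of the elliptical contact patch: pi * (w/2) * (h/2).
function contactArea(t: TouchPoint): number {
  return Math.PI * (t.width / 2) * (t.height / 2);
}

const thumb: TouchPoint = { identifier: 0, pageX: 100, pageY: 200, width: 20, height: 30 };
console.log(contactArea(thumb).toFixed(1)); // ≈ 471.2
```

An oblong thumb press (20×30 px here) would collapse to a single misleading number under a lone `size` attribute, which is the point being made above.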
>
> It might require a bit too much violence to fit the attributes of pan,
> zoom, rotate, and swipe all into the two parameters direction and delta.
> We have panX/Y delta coordinates, a scale factor, a rotation angle, and
> the speed of each. Our event types are simply manipulate start/move/end,
> rather than encoding in the event type what kind of change happened. I
> think that most of the time all of these changes happen in parallel, but
> I see the point in being able to be notified, e.g., only when the pan
> coordinates change.
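A rough sketch of the event model described above (field and type names are my own illustration, not the actual Starlight API): a single manipulatestart/move/end family whose payload carries pan, scale, and rotate deltas together, which a listener accumulates into an absolute transform.

```typescript
// Hypothetical manipulate-event payload: one event family, combined deltas.
type ManipulatePhase = "manipulatestart" | "manipulatemove" | "manipulateend";

interface ManipulateEvent {
  type: ManipulatePhase;
  panX: number;          // pan delta since the previous event, px
  panY: number;
  scale: number;         // multiplicative scale factor since the previous event
  rotation: number;      // rotation delta since the previous event, degrees
  panSpeedX: number;     // current speeds, e.g. px/s ...
  panSpeedY: number;
  scaleSpeed: number;    // ... scale factor per second ...
  rotationSpeed: number; // ... and deg/s
}

interface Transform { x: number; y: number; scale: number; angle: number; }

// Fold one event's deltas into an accumulated transform, as a listener might.
function apply(t: Transform, e: ManipulateEvent): Transform {
  return {
    x: t.x + e.panX,
    y: t.y + e.panY,
    scale: t.scale * e.scale,
    angle: t.angle + e.rotation,
  };
}
```

A listener that only cares about panning would simply ignore the scale and rotation fields, or fire its own callback only when panX/panY are nonzero, which matches the "notify only when pan coordinates change" use case.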
>
> The manipulate/gesture naming is of course a matter of taste, but we
> wanted to name the events after what is done (manipulate) rather than
> after the secret handshake needed to do it (gesture). This way the
> events are mentally less tightly bound to any particular UI technology.
> For instance, our implementation starts panning when one finger moves,
> instead of requiring two fingers to be present before what can be called
> a gesture begins. The same manipulate stream can then be continued by
> adding a second finger and rotating. In another UI style, pan and rotate
> could just as well be done by touching the screen and turning a jog wheel.
>
> To mention some omissions of our implementation: only one manipulate
> event stream can be active at any given time, and touchover/touchout
> are not implemented.
>
> Let’s hope a real discussion about the topic can now begin, and let’s
> not jump to premature conclusions about anyone’s patent applications.
>
> - Kari Hiitola
> The architect of Nokia’s Starlight project

Received on Friday, 16 October 2009 20:21:31 UTC