Re: Touch and gestures events

On Thu, Oct 15, 2009 at 7:33 AM,  <kari.hiitola@nokia.com> wrote:
>
> Hi,
>
> I suppose that the interest Olli mentioned was ours (Nokia). Unfortunately
> there was a long delay before we were able to participate in the discussion
> and release our implementation, but yes, we have previously discussed touch
> events with Olli. We would be interested in participating in the
> standardization of touch and gesture events.
>
> We have implemented touch and manipulate (including pan, scale and rotate)
> events on top of Qt 4.6 and WebKit, and the open-sourced results can be seen
> at http://opensource.nokia.com/Starlight .

What is that?

| Directly manipulated multitouch user interfaces, combined
| with rich visual content, are today's trend in mobile devices

Trend-based programming is all the rage.

> Working binaries for Windows 7
> (on Dell Latitude XT/XT2, some HP TouchSmarts), source, videos and some
> documentation can be found by following the links there. For the period when
> there is no existing standard, we also funded an event abstraction library

The decision came from (misguided individuals in) Nokia USA?

> in Script.aculo.us. Its release post is here:
> http://mir.aculo.us/2009/10/15/pinch-twist-zoom-bringing-multitouch-to-a-browser-near-you/

What problem does this solve?

>
> The approach that we had was a bit different from Olli’s, but I see a lot
> of merit in his proposal, especially in extracting the common parameters
> into PointerEvent. Naturally, there are also some ideas for improvement:
>
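
For context, "extracting the common parameters" presumably means a shared
base interface along these lines (a sketch in TypeScript; the interface and
property names are guesses, not Olli's actual draft):

    // Hypothetical base interface for fields shared by touch and
    // gesture events, in the spirit of the PointerEvent idea.
    interface PointerEventBase {
      pointerId: number; // which contact/pointer the event describes
      clientX: number;   // position in viewport coordinates
      clientY: number;
      timeStamp: number; // when the event was generated, in ms
    }
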
> In TouchEvent, instead of size there should be separate height and width,
> because the touch area is not necessarily a circle or a square. Of course
> that is also an approximation, but in real life it is mostly accurate
> enough. Having a bitmap of the touch area is a natural next step, but I
> think it isn’t needed for now, and future-proofing usually fails miserably.
>
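
To make the height/width suggestion concrete, a touch point might be shaped
roughly like this (a sketch only; the interface and property names are
illustrative, not taken from either proposal):

    // Hypothetical touch point with separate extents instead of a
    // single "size", approximating a non-square contact area.
    interface TouchPoint {
      identifier: number; // stable id for this contact
      clientX: number;    // contact center, viewport coordinates
      clientY: number;
      width: number;      // approximate contact width, CSS pixels
      height: number;     // approximate contact height, CSS pixels
    }
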
> It might require a bit too much violence to fit the attributes of pan,
> zoom, rotate, and swipe all into the two parameters direction and delta.
> We have panX/Y delta coordinates, a scale factor, a rotation angle, and
> the speed of each of them. Our event types are only manipulate
> start/move/end, instead of the event type telling what kind of change
> happened. I think that most of the time all of these happen in parallel,
> but I see a point in being able to be notified e.g. only when the pan
> coordinates change.
>
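
Spelled out, the payload described there might look like the following
(a sketch; the property names are guesses, not the actual Qt 4.6/WebKit
API):

    // Hypothetical manipulate event: pan/scale/rotation deltas plus a
    // speed for each component, with start/move/end phases.
    type ManipulatePhase =
      'manipulatestart' | 'manipulatemove' | 'manipulateend';

    interface ManipulateEvent {
      type: ManipulatePhase;
      panX: number;          // pan delta since the previous event, x
      panY: number;          // pan delta since the previous event, y
      scale: number;         // scale factor since the previous event
      rotation: number;      // rotation delta, in degrees
      panSpeed: number;      // current speed of each component
      scaleSpeed: number;
      rotationSpeed: number;
    }
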
> The manipulate/gesture naming is of course a matter of taste, but we
> wanted to give a name according to what is done (manipulate) instead of
> what secret handshake is needed to do it (gesture). This way the events
> are mentally less tightly bound to any UI technology. For instance, our
> implementation starts panning when one finger moves, instead of requiring
> two fingers to be present to start what can be called a gesture. The same
> manipulate stream can be continued by adding a second finger and rotating.
> In another UI style, pan and rotate could just as well be done by touching
> the screen and rotating a jog wheel.
>
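
Consuming such a stream while reacting only to the pan component could then
look like this (hypothetical usage built on the ManipulateEvent sketch
above, not a shipping API):

    // Hypothetical usage: one listener for the whole manipulate stream,
    // acting only when the pan component actually changed.
    const element = document.documentElement;
    element.addEventListener('manipulatemove', (evt: Event) => {
      const m = evt as unknown as ManipulateEvent; // custom payload
      if (m.panX !== 0 || m.panY !== 0) {
        window.scrollBy(m.panX, m.panY); // scale/rotation may change too
      }
    });
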
> To mention some omissions of our implementation: there can be only one
> manipulate event stream happening at any given time, and
> touchover/touchout are not implemented.
>
> Let’s hope that there can now be a real discussion about the topic.

[snip]

A "real discussion"?

I would think that would include identifying problems and having them
post those problems along with a few alternatives.

A productive discussion would include reasons for and insight into the
design of touch events, including a problem scenario and a proposed
solution.

The one known existing API (Apple's) for touch events has not been
justified (and justification for specific parts was directly asked for,
by me, above, in this thread).

Garrett

Received on Thursday, 15 October 2009 19:06:49 UTC