Re: Touch and gestures events

Hi,

I suppose that the interest Olli mentioned was ours (Nokia). Unfortunately there was a long delay before we were able to participate in the discussion and release our implementation, but yes, we have previously discussed touch events with Olli. We would be interested in taking part in the standardization of touch and gesture events.

We have implemented touch and manipulate (including pan, scale, and rotate) events on top of Qt 4.6 and WebKit, and the open-sourced results can be seen at http://opensource.nokia.com/Starlight . Working binaries for Windows 7 (on the Dell Latitude XT/XT2 and some HP TouchSmarts), source, videos, and some documentation can be found by following the links there. To bridge the period before a standard exists, we also funded an event abstraction library in Script.aculo.us. Its release post is here: http://mir.aculo.us/2009/10/15/pinch-twist-zoom-bringing-multitouch-to-a-browser-near-you/

Our approach was a bit different from Olli's, but I see a lot of merit in his proposal, especially in extracting the common parameters into PointerEvent. Naturally, I also have some ideas for improvement:

In TouchEvent, instead of size there should be separate height and width, because a touch area is not necessarily a circle or a square. That is of course also an approximation, but in practice it is usually accurate enough. Having a bitmap of the touch area is a natural next step, but I don't think it is needed for now, and future-proofing usually fails miserably.
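
As a rough sketch of what I mean (in TypeScript notation for concreteness; the attribute names are mine, not from any spec):

    // Sketch only: assumes the rest of Olli's PointerEvent proposal.
    // touchWidth/touchHeight replace the single size attribute.
    interface TouchEventSketch extends UIEvent {
        readonly pressure: number;     // as in the original proposal
        readonly touchWidth: number;   // width of the touch area, in px
        readonly touchHeight: number;  // height of the touch area, in px
    }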

It might require a bit too much violence to fit the attributes of pan, zoom, rotate, and swipe all into the direction and delta parameters. We have panX/Y delta coordinates, a scale factor, a rotation angle, and a speed for each of them. Our event types are just manipulate start/move/end, rather than encoding in the event type what kind of change happened. I think that most of the time all of these changes happen in parallel, but I do see the point of being able to be notified e.g. only when the pan coordinates change.
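
Roughly, our payload looks like the following sketch (again TypeScript notation; the names approximate our implementation and are illustrative rather than a spec):

    // Approximate shape of a Starlight manipulate event. All of the
    // components are delivered in parallel in a single event.
    interface ManipulateEventSketch extends UIEvent {
        readonly panXDelta: number;      // horizontal pan since the last event
        readonly panYDelta: number;      // vertical pan since the last event
        readonly scaleFactor: number;    // multiplicative zoom change
        readonly rotationAngle: number;  // rotation change, in degrees
        readonly panSpeed: number;       // plus a speed for each component
        readonly scaleSpeed: number;
        readonly rotationSpeed: number;
    }
    // Event types: "manipulatestart", "manipulatemove", "manipulateend".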

The manipulate/gesture naming is of course a matter of taste, but we wanted to name the events according to what is done (manipulate) rather than what secret handshake is needed to do it (gesture). That way the events are mentally less tightly bound to any particular UI technology. For instance, our implementation starts panning as soon as one finger moves, instead of requiring two fingers to be present before anything that could be called a gesture begins. The same manipulate stream can then be continued by adding a second finger and rotating. In another UI style, pan and rotate could just as well be done by touching the screen and turning a jog wheel.
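
In code, that continuity means a single listener can follow the whole stream (hypothetical event name, building on the sketch above):

    // One listener follows the entire manipulate stream: one finger
    // yields only pan deltas; a second finger starts feeding scale
    // and rotation into the very same stream of events.
    const el = document.getElementById("photo")!;
    el.addEventListener("manipulatemove", (ev: Event) => {
        const m = ev as unknown as ManipulateEventSketch;
        console.log(m.panXDelta, m.panYDelta, m.scaleFactor, m.rotationAngle);
    });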

To mention some omissions of our implementation: only one manipulate event stream can be active at any given time, and touchover/touchout are not implemented.

Let's hope that a real discussion about the topic can now take place, and let's not draw premature conclusions about anyone's patent applications.

 - Kari Hiitola
The architect of Nokia's Starlight project



On 10/14/09 3:40, "ext Garrett Smith" <dhtmlkitchen@gmail.com> wrote:

On Wed, Jun 17, 2009 at 1:22 PM, Olli Pettay <Olli.Pettay@helsinki.fi> wrote:
> Hi all,
>
> there seems to be some interest to standardize touch and gesture events.

Please provide links to where interest was indicated.

Especially valuable would be links where mobile companies have provided
information about the hardware technology and how it relates to the
touch events API.

> They are becoming more common on OS level: Windows 7 has them, iPhone has
> them, OSX has at least gestures, X is getting support for them etc.
>

Can Touch and Gesture events be separated, or are they necessarily coupled?

> So perhaps there should be a new spec for touch and gesture events.
> Hopefully a pretty simple spec.

Would it be simpler to have touch events as a stand-alone standard?

>
> It is not yet quite clear what the events should look like, but maybe
> something like this:
>
> interface PointerEvent : UIEvent {
>        readonly attribute long streamId;
>
>        readonly attribute long screenX;
>        readonly attribute long screenY;
>
>        readonly attribute long clientX;
>        readonly attribute long clientY;
>
>        readonly attribute boolean ctrlKey;
>        readonly attribute boolean shiftKey;
>        readonly attribute boolean altKey;
>        readonly attribute boolean metaKey;
>        void initPointerEvent(...);
> };
>
> interface TouchEvent : PointerEvent {
>        readonly attribute float pressure;
>        readonly attribute long size;
>        void initTouchEvent(...);
> };
> Touch Event types: touchdown, touchmove, touchup, touchover, touchout
>

A blatant violation of US Patent: US20090225039.


>
> Touch Event handlers might want to know all the current touch positions, but

The Event instance does not need to know the current touch positions,
but the callback does. The current touch positions could easily be
stored in a place where they can be polled: on the Event, a View, or
some other object.

> even then, all the events should be dispatched, so that
> events propagate properly also for example through (XBL2) anonymous content.

For those of us who are unfamiliar with how events are propagated with
XBL2 anonymous content, can you please explain?

> Perhaps defaultView (window) could have a property which lists
> current touches.

This allows the callback to check that property (currentTouches, or
whatever) on window.
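
For example (currentTouches is a hypothetical property name here, sketched in TypeScript):

    // Hypothetical: positions are polled from the view rather than
    // carried as lists on every event.
    document.addEventListener("touchmove", () => {
        const touches = (window as any).currentTouches; // hypothetical
        for (const t of touches ?? []) {
            console.log(t.clientX, t.clientY);
        }
    });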

> (I don't like iPhone's touch events which contain all sorts
> of lists of touches.)
>
No; fortunately that API is patented, so it won't be usable (and hence
will not be used as a public standard).

The validity of that patent seems questionable, however. It itself
borrows from the existing D3E API and idioms, so it is not completely
original.

> Gesture Events have the problem where to dispatch the events. Should
> gestureStart be dispatched to the same node as gestureEnd? What if the node
> is removed, or moved to another document? User is still doing the gesture,
> so the event should be dispatched to somewhere. Where should other gesture
> events be dispatched? Whatever is "under the gesture"?
>
> Any comments, ideas?
>
Apple has patented Touch events under US Patent: US20090225039.

I do not mind the innovation so much, but the fact that it is
proprietary leads to difficulty for cross-device scripting. If the
proliferation of touch-screen devices were matched 1:1 with
proprietary touch event APIs, the result would be greater divergence
than was seen during the Great Browser Wars.

(I see Palm now has a constant Event.MOUSE_DRAG.)

Has it been so long since the Browser Wars of IE and Netscape that the
lesson has been forgotten? Were the consequences not understood deeply
enough?

Perhaps Maciej or other Apple developers can comment on some of the following.

* initTouchEvent takes 18 parameters. Three of these are TouchList
parameters, and each TouchList comprises one or more Touch instances.
Was a simpler API considered?

* the TouchEvent payload delivered to the event listener does not
carry the interesting data itself. Instead, the interesting data is on
the "changedTouches" TouchList. The callback must cope with this
indirection by handling not just a touch event, but a touch event that
is not directly useful (the information the program needs is in the
touches or changedTouches).
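
To illustrate the indirection (a sketch against the shipped WebKit touch API, assuming standard DOM typings):

    // The listener receives a TouchEvent, but the useful data lives
    // one level down, in the changedTouches TouchList.
    document.addEventListener("touchmove", (ev: TouchEvent) => {
        for (let i = 0; i < ev.changedTouches.length; i++) {
            const t = ev.changedTouches[i];
            console.log(t.identifier, t.clientX, t.clientY);
        }
    });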

I would like to see input from Palm, RIM, Opera, and other developers
of mobile browsers and touch screen devices on this thread.

Garrett
