Re: Indirect touch events?

On Aug 17, 2011, at 7:17 AM, Matt Brubeck wrote:
> On 08/16/2011 11:07 AM, Nathan Vander Wilt wrote:
>> What is the plan for addressing indirect touch events — i.e. those that are not received relative to screen/client coordinates, but relative to the input device itself?
> 
> We haven't really discussed this yet.  Thanks for bringing it up.
> 
>> At a low-level, it's just a matter of giving JavaScript touch events with deviceX/deviceY properties, preferably normalized to the [0-1] range, as well as the same gesture info as direct manipulation. But at the spec level, there needs to be some way of handling and communicating this distinction.
> 
> Yes, I see.  Can you sketch out a couple of use cases to illustrate what developers might do with these events?

Well, for me personally the biggest use case is map navigation: scrolling should pan, and pinching should zoom. Until higher-level gesture support is specified, this is fairly simple to calculate from the raw touch data, so long as that data is provided and its default actions can be cancelled via .preventDefault().
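
For what it's worth, here is a rough sketch of the kind of calculation I mean. The deviceX/deviceY properties are the hypothetical ones discussed further down in this message (falling back to clientX/clientY for direct touches), and mapElement/map.zoomBy() just stand in for the app's own objects:

    // Rough sketch only: pinch-to-zoom over raw touch data. deviceX/deviceY
    // and map.zoomBy() are hypothetical, not anything currently specified.
    var lastDistance = null;

    function touchPoint(t) {
        return ('deviceX' in t) ? { x: t.deviceX, y: t.deviceY }
                                : { x: t.clientX, y: t.clientY };
    }

    mapElement.addEventListener('touchmove', function (e) {
        if (e.touches.length !== 2) { lastDistance = null; return; }
        e.preventDefault();   // keep the browser from scrolling/zooming the page itself
        var a = touchPoint(e.touches[0]), b = touchPoint(e.touches[1]);
        var d = Math.sqrt((b.x - a.x) * (b.x - a.x) + (b.y - a.y) * (b.y - a.y));
        if (lastDistance) map.zoomBy(d / lastDistance);   // pinch changes the zoom level
        lastDistance = d;
    }, false);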

I'm also personally doing a lot with photo navigation and organization using HTML as the platform. The basic "light table" gestures here are zooming and spinning photos to arrange them in a layout. Again, higher-level gesture events would be preferred, but access to the raw NSTouch-style data is better than nothing.
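
The spin is just as simple a calculation over the same raw data; another sketch, again with the hypothetical deviceX/deviceY naming:

    // Sketch: angle between two raw touches, using hypothetical deviceX/deviceY.
    function touchAngle(t1, t2) {
        return Math.atan2(t2.deviceY - t1.deviceY,
                          t2.deviceX - t1.deviceX);   // radians
    }
    // On each touchmove, rotate the photo by the difference between this
    // angle and the previous one; scale it by the change in distance.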

A third example would be 3D navigation, for example of a WebGL model. Here basic higher-level gestures would be less useful, and the raw touch data could be interpreted in whatever way makes sense per app. (For some early examples see Jeff Han's demos, but I'm sure there's more experimentation that JavaScript developers would love to do.)

And finally, I'm certain that games and other media "surface editing" apps could make huge use of the raw indirect touch events in extremely creative ways. The desktop is not dead, and for some uses indirect touch may be a preferable interaction medium over direct "fingers covering the screen" manipulation.


> And can you point us to documentation for similar APIs in Mac OS X or elsewhere?

On Mac OS X, NSTouch events [1] provide a normalized finger position and also information about the input device size (perhaps most useful for discovering the aspect ratio). They are received through the NSResponder methods listed under the "Touch and Gesture Events" section [2] of that documentation. I imagine providing indirect touches to in-browser code would look quite similar to the touch events this working group is already specifying, but with the document-relative properties of every active touch reflecting the (mouse) *cursor* position, plus additional floating-point deviceX/deviceY properties that reflect each touch's actual normalized coordinates. (Normalized meaning one corner of the device is at 0,0, the opposite corner at 1,1, and the middle of the device at 0.5,0.5.)

[1] http://developer.apple.com/library/mac/documentation/AppKit/Reference/NSTouch_Class/Reference/Reference.html
[2] http://developer.apple.com/library/mac/documentation/Cocoa/Reference/ApplicationKit/Classes/NSResponder_Class/Reference/Reference.html#//apple_ref/doc/uid/20000015-SW35
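
To make that concrete, handling such an event might look something like this to script. This is purely hypothetical; none of the property names here (deviceX/deviceY) are specified anywhere yet, and "element" stands for whatever node the app is listening on:

    // Hypothetical only -- none of these names exist in any spec today.
    element.addEventListener('touchmove', function (e) {
        for (var i = 0; i < e.touches.length; i++) {
            var t = e.touches[i];
            // For an indirect device, clientX/clientY would just track the cursor...
            var cursorX = t.clientX, cursorY = t.clientY;
            // ...while deviceX/deviceY would give the normalized finger position:
            // 0,0 at one corner of the trackpad, 1,1 at the opposite corner.
            if (typeof t.deviceX === 'number') {
                console.log('finger at', t.deviceX, t.deviceY);
            }
        }
    }, false);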

> Some tricky parts:  There's no natural mapping from these device coordinates to targets in a web page.  Touches on a touch pad have default actions like mouse movement and scrolling that already generate events and affect web pages; content will want to prevent these default actions, but we won't want to let content break users' experience too badly (e.g. by capturing the mouse and not letting go).  Content that uses touch events would often want to work with both "direct" and "indirect" touch events, so we should make it as simple as possible to use both.

Indeed, this is a key point and that's why I was moved to bring up the distinction on this list. With indirect manipulation you *can't* guarantee that clientX/clientY represent the position of actual "fingers", and providing enter/leave events is similarly problematic.

In the end, the overall event architecture is not significantly different (especially at the "gesture" level as you describe below), but at the lower level they are fundamentally different models of interaction. Perhaps in certain apps the two kinds of touch events could be handled the same way, but in general it would need to be implemented as progressive enhancement in three different directions: one for the traditional mouse, one for indirect multitouch, and one for direct multitouch. I would just be disappointed to see the middle one left out of reach of web developers!
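
Concretely, I imagine that progressive enhancement looking roughly like this. The deviceX check is again hypothetical, and the handler functions are just placeholders for the app's own code:

    // Very rough sketch of three-way progressive enhancement.
    element.addEventListener('touchstart', function (e) {
        if (typeof e.touches[0].deviceX === 'number') {
            handleIndirectTouches(e);   // trackpad-style: normalized device coordinates
        } else {
            handleDirectTouches(e);     // touchscreen-style: fingers map to page coordinates
        }
    }, false);
    element.addEventListener('mousedown', handleMouse, false);   // traditional mouse fallback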


> In Gecko, certain trackpad touch gestures are translated into high-level "intentional" events like MozMagnifyGesture and MozRotateGesture. (Currently I believe these are exposed to Gecko "chrome" but not to web content.)
> https://developer.mozilla.org/En/Mouse_gesture_events
> 
> This working group has plans to standardize intentional events like these, although we haven't started writing a spec yet.  The high-level events may be better for most common use cases than low-level events for lots of different input modes.

It appears from my limited research that Adobe Flash has chosen to provide only *gesture* information (and not individual touches) in the case of indirect multitouch:
http://help.adobe.com/en_US/FlashPlatform/beta/reference/actionscript/3/flash/ui/Multitouch.html

Leaving out the raw touch information is unfortunate for several (though certainly not all) of the use cases described above.

I think your intuition is right that high-level events will be more widely useful. Of course, helper libraries can spring up in the meantime to shim up to this level given access to the low-level events.


>> Can compatibility with indirect touch events be opened as an issue on this spec, please?
> 
> Yes, I'll open an issue based on responses to this thread.

Thanks! To reiterate, I'm not asking this working group to find a magic conversion from indirect touch events into direct ones; I'm just asking for a way for web application code to receive and handle modern trackpad data.

regards,
-natevw

Received on Friday, 19 August 2011 06:23:53 UTC