Re: Touch and gestures events

> I am aware that Apple has rotation and scaling. Must these necessarily
> be coupled to the input device ("touch")? Could the rotation/scale be
> independent of that, much like a contextmenu event is independent of
> the input device or method that triggers it? (Recall old Mac IE, where
> click-hold caused a context menu to appear.)
>
> Is it possible to, as RIM does, map mouse events to touch events?
>
> If so, I would like to propose that, if finger movements are mapped to
> scrolling the document, this is mapped to the default action for a
> "mousemove" event. For example:-

I've outlined this in my previous email.

Example: if you press, drag and lift your finger, you produce a set of
events. If you then get panning, a text selection, or a drag & drop
action, that is just the default action of that gesture, which the user
agent implements according to its use cases and target devices. Gestures
themselves are not what should be specified, because gestures are like
keyboard shortcuts: there are almost infinite possibilities.
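
To make that concrete, here is a minimal sketch in TypeScript, assuming
the touchstart/touchmove/touchend event names from the early touch event
proposals. The page only sees the low-level events; whether the drag
ends up panning the document is the user agent's default action, which a
page can cancel but does not otherwise control:

  // Cancelling the low-level event suppresses the default action
  // (panning, in this case), not the gesture itself; the finger
  // movement still happens and the remaining touch events still fire.
  document.addEventListener("touchmove", (event: TouchEvent) => {
    event.preventDefault();
  });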

What you want are the UI events for panning and zooming, and other UI
actions that might be triggered by gestures. Those events should indeed
be supported, just like focus & blur for instance, but completely
decoupled from a touch screen device, because they can be triggered by
other means. For instance, desktop browsers obviously let you scroll a
document, and that scroll event can be extended to the panning action one
gets on mobile browsers, because panning and scrolling are the same
thing. Many popular desktop browsers also support full-page zooming, but
I doubt they fire events when the user chooses a different zoom level.
Opera for desktop has always had mouse gestures, and there are many
Firefox extensions that add gesture support. Safari on Mac OS X 10.6
also supports gestures, I think.
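
To illustrate that decoupling, a small TypeScript sketch: "scroll" is a
real, device-independent event that fires however the scroll position
changed (wheel, keyboard, scrollbar, or finger panning). The "zoom"
event beside it is hypothetical; no such event exists, and it only shows
what a UI-level zoom notification might look like if one were specified:

  // "scroll" fires regardless of the input device that caused it.
  window.addEventListener("scroll", () => {
    console.log(`scrolled to ${window.scrollX}, ${window.scrollY}`);
  });

  // Hypothetical "zoom" event, decoupled from whether a pinch gesture,
  // a keyboard shortcut, or a menu item changed the zoom level. The
  // event name is an assumption for illustration only.
  window.addEventListener("zoom", (event: Event) => {
    console.log("zoom level changed:", event);
  });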

So, all these features should not be bound in any way to touch devices  
alone.

Received on Friday, 16 October 2009 23:08:21 UTC