On Fri, Jun 24, 2011 at 12:00 PM, Vincent Scheib <scheib@google.com> wrote:
> I've added clarification to the draft spec in the use case section,
> "Touch screen device input
> All the application use cases are relevant on touch screen devices as well.
> A user should be permitted to make large gestures that are interpreted as
> movement deltas, without concern over the absolute positions of the touch
> points. Other UI elements from the user agent and system should be
> suppressed while under mouse lock.
>
> Absolute position of touch points should always be available, unlike mouse
> input where the concept of the cursor is removed. On a touch screen the
> absolute positions will always be relevant."
>
I don't believe it makes sense to have a "mouse lock" mode on a touchscreen
that causes touches on UI elements to instead be delivered as mouse events to
the window. That would be very confusing.
Touching in-window and then dragging out-of-window on a touchscreen is
useful, but you should be able to do that with regular mouse (or touch)
events already.
Such a mode could be used in fullscreen, where there are no UI elements, but in
that case it's not necessary--you can just use mouse events directly. The
main reason for mouse lock--the issue of the mouse being moved beyond the
edge of the screen--doesn't apply to touchscreens (you can't touch outside
the screen).
--
Glenn Maynard