Re: User interaction events

On Wed, Apr 16, 2014, Anne van Kesteren <> wrote: 

> On Wed, Apr 16, 2014, Shane McCarron <> wrote:
>> Perhaps I am wrong, but isn't the IndieUI activity working on this sort of stuff right now? 

Only partially. Anne is asking about physical events. IndieUI is about allowing the UA, OS, or other intermediary to abstract physical events into meaningful semantic events, and send those semantic events to the web application. There is some overlap with the "order of operations" comment, which is currently mentioned in the IndieUI Events spec a few times as a TBD. I'd encourage the TAG to include the IndieUI Events module in this review. There are still big chunks missing, but we'd be happy to answer any questions.

Here's the current editor's draft:

> Not really, they're basically adding a layer of difficulty in figuring out what it all means later.

I don't think "adding a layer of difficulty" is a fair synopsis. If anything, IndieUI Events is intended to reduce authoring complexity, because most web authors do custom physical event interactions so poorly now, and neither DOM Events nor Pointer Events sufficiently address all types of physical events. These specs can't, because not all physical event interactions have been invented yet.

For example, if a web author implements a custom scroll view (like a 2.5D circular carousel), he or she currently has to register for and respond appropriately to:

 Mouse events (3 or more types)
 Touch events (3 or more types)
 Keyboard events (arrows, pageup, pagedown, home, end, spacebar, etc.)
 Scroll change (on a backing native view) and/or wheel events.

Most web authors don't make it this far correctly, but even if they do, it's a ridiculous level of complexity, currently requiring 8+ event handlers. Even if some of those pointer-like events are eventually combined, the custom scroll view still doesn't handle "scrolls" triggered by unknown physical events, such as:
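To make the scale of that concrete, here is a minimal sketch of the registrations such a custom scroll view needs today. The element is stubbed (a plain object with an addEventListener that just records the type) so the sketch runs outside a browser; the handler bodies are placeholders, not a real implementation.

```javascript
// Stub standing in for a real DOM element, so this runs outside a browser.
const carousel = {
  listeners: [],
  addEventListener(type, handler) { this.listeners.push(type); }
};

function onPointer(e) { /* translate screen coordinates into a scroll delta */ }
function onKey(e)     { /* map arrows, PageUp/PageDown, Home/End, Space, etc. */ }
function onWheel(e)   { /* apply e.deltaX / e.deltaY to the view position */ }

// Mouse events (3 or more types)
["mousedown", "mousemove", "mouseup"].forEach(t => carousel.addEventListener(t, onPointer));
// Touch events (3 or more types)
["touchstart", "touchmove", "touchend"].forEach(t => carousel.addEventListener(t, onPointer));
// Keyboard events
carousel.addEventListener("keydown", onKey);
// Wheel events (and/or scroll on a backing native view)
carousel.addEventListener("wheel", onWheel);

console.log(carousel.listeners.length); // 8 registrations before any app logic
```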

 Speech control (e.g. "scroll down")
 Unspecified mainstream physical interfaces (e.g. Kinect gestures)
 Some events triggered by assistive technology (e.g. screen readers usually set the scroll position of a native scroll view programmatically)

These custom scroll views aren't even programmatically detectable as scroll views. Even if they were, the only way a custom view could be controlled by an alternate physical interface would be to simulate physical events. That approach is clunky, unpredictable, and error-prone.

IndieUI Events specifies a way for elements to respond to specific types of semantic interaction patterns (e.g. "I act like a scroll view and accept 'scroll request' events, regardless of the physical user interface used to trigger scrolling.") The event "receiver" needs to be declared so the UA can prevent blocking events on ancestor or sibling elements. (e.g. If a custom scroll view were a descendant of a native scroll view, we'd still want hardware-accelerated scrolling to work on the native scroll view, even if the event listener is delegated to the body element.)

The "scroll request" event object would contain origin and deltas for continuous scroll events or keyword-based indicators of discrete events. The web application controller can handle these events using the properties associated with the event object. IndieUI Events covers existing physical interactions that are currently impossible to handle, and it covers future physical interactions that don't yet exist.
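A hedged sketch of what handling such an event could look like. The event name (scrollrequest) comes from the draft, but the property names used here (scrollType, deltaY) are illustrative assumptions, and the "events" are simulated plain objects so the sketch runs outside a browser.

```javascript
// Hypothetical handler for a semantic "scroll request" event.
// Property names (scrollType, deltaY) are assumptions for illustration,
// not necessarily the draft's final interface.
let scrollTop = 0;

function onScrollRequest(event) {
  if (event.scrollType === "delta") {
    scrollTop += event.deltaY;   // continuous: apply the reported delta
  } else if (event.scrollType === "pageDown") {
    scrollTop += 400;            // discrete: keyword-based request
  }
  // The UA dispatches this regardless of whether the user scrolled by
  // touch, wheel, speech ("scroll down"), or assistive technology.
}

// Simulated events standing in for UA-dispatched ones:
onScrollRequest({ scrollType: "delta", deltaY: 25 });
onScrollRequest({ scrollType: "pageDown" });
console.log(scrollTop); // 425
```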

IndieUI also reduces the current authoring complexity of continuous events from 8 or more physical events to 4 semantic events. For example, with the custom scroll view:

 scrollrequest for the discrete events (e.g. "down", "pageDown", etc.)
 scrollstartrequest, scrollchangerequest, and scrollendrequest for the continuous event (origin x/y, delta x/y).
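The continuous triple above could be wired up roughly as follows. Again a sketch: the three event names are from the draft, the property names (x, y, deltaX, deltaY) are assumptions, and the gesture is simulated with plain objects.

```javascript
// Hedged sketch of the three continuous-scroll request events.
// Property names (x, y, deltaX, deltaY) are illustrative assumptions.
const view = { x: 0, y: 0, dragging: false };

const handlers = {
  scrollstartrequest(e)  { view.dragging = true; },   // e carries the origin x/y
  scrollchangerequest(e) {
    if (view.dragging) { view.x += e.deltaX; view.y += e.deltaY; }
  },
  scrollendrequest(e)    { view.dragging = false; }
};

// One continuous gesture, simulated:
handlers.scrollstartrequest({ x: 10, y: 10 });
handlers.scrollchangerequest({ deltaX: 0, deltaY: 30 });
handlers.scrollchangerequest({ deltaX: 0, deltaY: 45 });
handlers.scrollendrequest({});
console.log(view.y); // 75
```

Three handlers cover the whole continuous interaction; scrollrequest covers every discrete one, so four semantic handlers replace the eight-plus physical ones above.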

In the case of other types of discrete events (e.g. escape from the current view, delete this item, etc.), it reduces the current number of event handlers from 2 (sometimes 3) down to 1.


Received on Wednesday, 16 April 2014 20:19:38 UTC