
RE: Some comments about Pointer Events

From: Jacob Rossi <Jacob.Rossi@microsoft.com>
Date: Wed, 21 Nov 2012 00:16:07 +0000
To: Rick Byers <rbyers@google.com>, François REMY <francois.remy.dev@outlook.com>
CC: Pointer Events WG <public-pointer-events@w3.org>
Message-ID: <D0BC8E77E79D9846B61A2432D1BA4EAE13843F96@TK5EX14MBXC288.redmond.corp.microsoft.com>
I think it's a valid design that a more capable device A should be able to provide a higher pressure value than a less capable device B (e.g. a value above 1).

I also think it's a valid design that UAs should be allowed to let users calibrate sensitivity.

However, here's where I differ:

François, you mentioned in your first mail that "I don't want the user to press 3 times more on the second one to achieve the same result." But if "normal pressure" is left up to the device or UA to define, then we don't achieve your goal (because "normal" could equate to different physical amounts).

I don't like the notion of just leaving this unrestricted. Unrestricted means it's hard to test. Hard to test means it's hard for developers to use, because they can't rely on predictable values. Consider a paint application that has 5 colors, where the color of the ink is determined by the pressure of the stylus. How do I determine the pressure ranges to map to each color if I can't be guaranteed a specific range of values (e.g. it might range 0-1, or it might range 0-2, as you indicated)?
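To illustrate the paint-application problem, here is a minimal sketch (the color values and bucketing scheme are made up for this example). It assumes pressure is normalized to [0, 1]; if a device can legally report values above 1, the top bucket silently absorbs everything past 1 and the developer's calibration no longer means what they intended.

```typescript
// Hypothetical sketch: map pointer pressure to one of five ink colors,
// assuming a guaranteed [0, 1] pressure range (each bucket spans 0.2).
const COLORS = ["#ffe", "#fc6", "#f93", "#c30", "#600"] as const;

function colorForPressure(pressure: number): string {
  // Out-of-range values (e.g. a device reporting up to 2) all collapse
  // into the last bucket, which is exactly the unpredictability at issue.
  const index = Math.min(Math.floor(pressure * COLORS.length), COLORS.length - 1);
  return COLORS[index];
}
```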

So, I'd be fine if we wanted to define a "normal" pressure that a more capable device could exceed. But then I think you need to know, per device, what the maximum possible value is.

That said, I'm intrigued by Rick's comment:
> I discovered [2] that it was possible to get values above 1 on Android.
I'm curious whether that's just because Android uses Samsung's proprietary pen APIs (which indicated higher values were possible). On Windows, the most popular devices are NTrig/Wacom digitizers, I believe. I'm not aware that they have the same problem, but perhaps they do; let me check. Regardless, I think *we* should solve this variance if at all possible. Otherwise, we're just punting the unpredictability to web developers.

So I guess that makes 3 possible options:
1) Normalized as spec'd (assuming this is generally possible)
2) Physical units (no current indication that devices report this)
3) "Normal" pressure defined by the device/UA/calibration, but maximum possible pressure is also exposed
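Option 3 could be sketched as follows. Note that `maxPressure` is a hypothetical per-device value, not anything any platform currently exposes; this just shows how a UA or library could re-normalize raw readings back into a predictable [0, 1] for authors.

```typescript
// Sketch of option 3 (assumption: the device exposes a maximum pressure).
// Raw readings above the maximum are clamped; a device that reports no
// usable range degrades to a binary contact signal.
function normalizePressure(raw: number, maxPressure: number): number {
  if (maxPressure <= 0) {
    return raw > 0 ? 1 : 0; // no range info: contact yes/no
  }
  return Math.min(raw / maxPressure, 1);
}
```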


PS - This is probably my last mail until I return from vacation next week.  But please carry on discussion!

-----Original Message-----
From: Rick Byers [mailto:rbyers@google.com] 
Sent: Wednesday, November 21, 2012 12:22 AM
To: François REMY
Cc: Jacob Rossi; Pointer Events WG
Subject: Re: Some comments about Pointer Events

On Tue, Nov 20, 2012 at 5:02 AM, François REMY <francois.remy.dev@outlook.com> wrote:
> |  Can you give me a "for instance" for you proposed approach?
> I already did: The Samsung SPen Android API returns
>    - 0 when there is no contact
>    - 1 at normal pressure (or when there's contact with a finger)
>    - any positive number, proportional to the normal pressure
> However, their documentation says that the "expected" range is usually 0-1 (but that, if the user modifies the pressure settings, the stylus can work in a [0-0.5] or a [0-2] range instead).
> So, to put things in perspective, it seems that the "normal value" of most styluses is their "maximal" value (unless they have been reconfigured to be less or more precise).
> My recommendation would then be to specify that the value returned by "pressure" is unrestricted (apart from being positive) but is -likely- to lie in the [0-1] range. However, some stylus + driver configurations may end up with results higher than 1.0 (for example, if the user increased the sensitivity and didn't check 'clamp values at 1'). Maybe that use case isn't very widespread for now (and maybe some UAs will ignore values higher than one and clamp them), but at least a UA (that wants to support a specific kind of input device with a "hard pressure" mode) can do so and still respect the specification.

I support this notion - in particular, that 1 indicates "normal" or "expected full" pressure but larger values are permitted. I think it will be effectively impossible to achieve precise and consistent semantics across OSes / devices, but I'd still like UAs to have some ability to provide a normalized value where there are at least two reference points (no pressure, and normal/full pressure).

We hit this with the Touch Events spec on Android. Touch Events [1] says: "a relative value of pressure applied, in the range 0 to 1, where 0 is no pressure, and 1 is the highest level of pressure the touch device is capable of sensing;". I discovered [2] that it was possible to get values above 1 on Android. The value comes from the Android MotionEvent.getPressure API, whose documentation states that the value "generally ranges from 0 (no pressure at all) to 1 (normal pressure), however values higher than 1 may be generated depending on the calibration of the input device.". So there's no great way to fix this spec violation: we either need a complex auto-normalization algorithm or, more likely, just clamp, discarding potentially useful information.

[1] http://dvcs.w3.org/hg/webevents/raw-file/tip/touchevents.html#widl-Touch-force
[2] https://bugs.webkit.org/show_bug.cgi?id=91799
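The "just clamp" fix described above amounts to something like this sketch: a UA forcing the raw Android reading (which may exceed 1 after calibration) into the [0, 1] range the Touch Events spec requires. It's trivial, but any distinction between "full" and "harder than full" pressure is lost.

```typescript
// Clamp a raw platform pressure reading into the spec's [0, 1] range.
// Values above 1 (possible on calibrated Android devices) become 1;
// defensive lower clamp handles any negative readings.
function clampPressure(raw: number): number {
  return Math.max(0, Math.min(raw, 1));
}
```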

> Also, I would prefer it if a device that doesn't support pressure returned '1' instead of '0' when it's being pressed down. Alternatively, we could use a 'supportsPressure' property. Or we could do both.
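The "do both" suggestion above could look roughly like this UA-side sketch. The `supportsPressure` name and the `reportPressure` shim are hypothetical, not part of any spec draft; the point is only that a pressure-insensitive device synthesizes 1 while in contact, and a flag lets authors tell the two cases apart.

```typescript
// Hypothetical shape combining both suggestions: a synthesized pressure
// of 1 on contact for pressure-insensitive devices, plus a capability flag.
interface PressureState {
  pressure: number;
  supportsPressure: boolean;
}

function reportPressure(rawPressure: number | null, inContact: boolean): PressureState {
  if (rawPressure === null) {
    // No pressure sensor: report 1 while in contact, 0 otherwise.
    return { pressure: inContact ? 1 : 0, supportsPressure: false };
  }
  return { pressure: rawPressure, supportsPressure: true };
}
```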
Received on Wednesday, 21 November 2012 00:17:09 UTC
