Re: [sysinfo] SEMC comments (was RE: Publishing System Information API FPWD)

From: Max Froumentin <maxfro@opera.com>
Date: Mon, 01 Feb 2010 11:34:04 +0100
Message-ID: <4B66AE1C.6000403@opera.com>
To: "Nilsson, Claes1" <Claes1.Nilsson@sonyericsson.com>
CC: "'Robin Berjon'" <robin@robineko.com>, "public-device-apis@w3.org" <public-device-apis@w3.org>
On 29/01/2010 15:08, Nilsson, Claes1 wrote:

>> I sort-of agree. But then shouldn't we make all the sensors return
>> a normalised value, for the sake of uniformity? That wouldn't work
>> so well for the atmospheric pressure, or ambient temperature, I
>> think.
> But currently there is no value range at all specified.

Yes, that could be added easily (and I've just done it) for all the
values where it made sense, i.e. all the sensors except ambient light.
You can't put a unit on ambient light, since different sensors wouldn't
measure the same value because of their different locations on the
device.
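
To make the "value range" idea concrete, here is a minimal sketch of how
a consumer might clamp a raw reading to a declared [min, max] range and
normalise it to [0, 1]. None of these names come from the draft; they
are illustrative only, as is the example pressure range.

```javascript
// Hypothetical helpers: clamp a raw sensor reading into its declared
// [min, max] range, then rescale it to the unit interval.
function clampReading(value, min, max) {
  return Math.min(Math.max(value, min), max);
}

function normaliseReading(value, min, max) {
  return (clampReading(value, min, max) - min) / (max - min);
}

// e.g. atmospheric pressure in kPa, assuming a declared range of [85, 110]
console.log(normaliseReading(101.3, 85, 110)); // ~0.652
console.log(clampReading(200, 85, 110));       // 110 (out-of-range reading clamped)
```

This is the uniformity trade-off discussed above: normalisation gives
every sensor the same [0, 1] scale, at the cost of hiding the physical
unit the range was defined in.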

>>> - Proximity: How does a "proximity"-sensor work? If you hold the
>>> device in your hand the object nearest to the device is your
>>> hand. If you put the device on a table the nearest object is the
>>> table. If you have the device in your pocket the nearest object
>>> is the pocket. Which are the use cases? How to separate any other
>>> object from the object that "holds" the device?
>> I don't really know, and I haven't done much research on proximity
>> sensors. Expert answers welcome!
> Proximity is tricky. It is probably difficult to achieve: " The
> distance from the device to the nearest object, as determined by this
> device's proximity sensors, in meters (m)". An object is always close
> to the device if we assume that it is not "flying in the air" :-) It
> is difficult, but might be possible, to distinguish the user's hand
> or pocket from any other object. The question is what do we want to
> achieve? Which are the use cases? There are many parameters to
> consider.  E.g: Where is the sensor situated? In which direction does
> the sensor operate? Which type of sensor, granularity of values?
> Etc.
> I will discuss this internally with experts at SEMC and come back but
> your view on use cases we want to support would be valuable.

Thanks. The only use case I can think of, right now, is detecting
whether a phone is being held next to someone's head, which indicates
that they are on a call. On the iPhone, this turns the display off to
prevent the user's ear from accidentally pressing buttons.
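
That use case reduces to a simple threshold check on the reported
distance. The sketch below is purely illustrative: the function names,
the 5 cm threshold, and the "in call" flag are my assumptions, not
anything from the draft; only the metres unit comes from the spec text
quoted above.

```javascript
// Hypothetical decision logic for the "phone at the ear" use case.
// Assumes the proximity reading is a distance in metres, per the draft.
const EAR_THRESHOLD_M = 0.05; // assume "at the ear" means within 5 cm

function shouldBlankDisplay(inCall, proximityMetres) {
  // Blank the display only while a call is active and an object
  // (presumably the head) is within the threshold distance.
  return inCall && proximityMetres <= EAR_THRESHOLD_M;
}

console.log(shouldBlankDisplay(true, 0.02));  // true: held to the head
console.log(shouldBlankDisplay(true, 0.60));  // false: held away
console.log(shouldBlankDisplay(false, 0.02)); // false: in a pocket, no call
```

Note how gating on the call state sidesteps the pocket/table ambiguity
raised above: a nearby object only matters while a call is in progress.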

Received on Monday, 1 February 2010 10:34:39 UTC
