- From: Robin Berjon <robin@berjon.com>
- Date: Wed, 9 May 2012 15:19:28 -0700
- To: Marcos Caceres <w3c@marcosc.com>
- Cc: "public-device-apis@w3.org public-device-apis@w3.org" <public-device-apis@w3.org>
On May 9, 2012, at 14:16, Marcos Caceres wrote:

> On Wednesday, May 9, 2012 at 10:13 PM, Tran, Dzung D wrote:
>>> So, what I agreed with Jonas about was a new event that only fired when there was a transition between near and far: device proximity for something that was more advanced, <insert this new event name here> for something really simple.
>>
>> What I am afraid of is that browsers are going to interpret far versus near differently on the same device. I would rather give the control to the programmer, to interpret based on value, min, max.
>
> I'm afraid of the opposite thing. I trust the people that are closer to the OS (or directly interfacing with the hardware) more to handle that.

The OS knows the sensor's calibration best; it ought to be able to give you near/far events directly. That's how things work on iOS, where you get a proximityState boolean when something is close to the device. Android does it more like Doug's proposal.

If the use case is just detecting proximity, then I prefer the iOS approach: the OS will know better, and the developer doesn't need to know more. If we're trying to do a generic distance sensor, then we're missing at least a field to indicate which of several sensors triggered (not common on phones, but I believe it's the norm on cars; yes, we have to get used to thinking about those too, I'm afraid :). I'm not convinced that we should try to merge the two use cases into a single approach.

--
Robin Berjon - http://berjon.com/ - @robinberjon
Received on Wednesday, 9 May 2012 22:19:54 UTC
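A minimal sketch of the two styles under discussion, from a page author's point of view. It assumes a raw-sensor "deviceproximity" event carrying value/min/max fields (per Doug's proposal as described in the thread) and, purely as a placeholder, a simple transition event here called "userproximity" with a boolean "near" in the spirit of iOS's proximityState; the actual name and shape of the simple event are left open in the thread.

```js
// Raw-sensor style: the page reads the measurement and picks its own threshold.
window.addEventListener('deviceproximity', function (event) {
  // Assumed fields: event.value (the sensor reading), bounded by event.min and event.max.
  var nearish = event.value < event.max * 0.1; // page-chosen threshold, arbitrary here
  console.log('reading:', event.value, 'near?', nearish);
});

// Transition style: the OS/UA decides what "near" means and only reports changes.
// "userproximity" and the boolean "near" are placeholder names for illustration.
window.addEventListener('userproximity', function (event) {
  console.log(event.near ? 'near' : 'far');
});
```

The contrast is the one made in the email: in the first style the calibration question lands on the page author, in the second the OS/UA answers it and the page only sees transitions.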