RE: [Sensors] Ambient Light Sensor use cases

Tran wrote:
> Incoming Call at the Theater
> 
> Before the show began, the user has placed her cell phone in
> vibrate/silent mode to be polite to the other audience members.
> However, she needs to screen calls as they come because if the call is
> from the emergency room, she needs to leave the performance and take
> the call. As calls come in, the device senses that it is in a dark
> location, turns the backlight down as low as possible, and adjusts the
> color scheme to avoid bright colors. These adjustments make it so that
> the user's eyes don't need to adjust; they also make the cell phone's LCD
> less distracting to other audience members.

I don't consider this use case compelling. The system should (and does!) do this automatically for *all* applications. The user wants the behavior everywhere, not just in the rare application written to take advantage of some obscure API.

> Direct Sunlight or outdoor Use
> 
> It's almost spring in the Northern Hemisphere, and the university lawns
> are packed with wishful thinkers trying to focus on homework while
> enjoying the warmth. The ambient light sensor detects the sun is
> shining and the application uses a color scheme with more contrast so
> that users can see more clearly in daylight.

Transflective displays [1] have been available since at least 2008.

> Photo Editing Application adjusts display colors based on the color of
> the ambient light
> 
> The user has just taken some photos and wants to upload them to a
> content sharing site on the internet. He navigates to the content
> sharing site's photo touch-up page, which uses the ambient light sensor
> to detect that the user is in a space with fluorescent lights and
> adjusts the user's colors so that the picture colors will look correct
> on the majority of users' displays.

This use case is somewhat contrived, though slightly less unreasonable than the previous one. Even a normal person understands that if you want decent color, you need to be somewhere you can see what you're doing. Auto color correction doesn't actually need a person to review it..., but if it did, that person would need to be able to see the colors clearly, and normal people understand that means going somewhere the sun isn't washing the colors out. If you try to do better than average, e.g. with Photoshop [2], then you definitely want to be able to see what you're working with.
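
To make the point concrete: the "auto" in auto color correction really is just arithmetic over the pixels. A bare-bones sketch (untested; assumes the Pillow library; a per-channel histogram stretch, roughly what the cheap "auto levels" buttons do, not Photoshop's actual algorithm):

    from PIL import Image, ImageOps

    def auto_color(path, out_path, clip_percent=1):
        """Cheap auto color: stretch each channel to fill 0..255,
        ignoring the brightest/darkest clip_percent of pixels."""
        with Image.open(path) as img:
            # autocontrast normalises each channel independently, which
            # removes most of a colour cast as a side effect.
            fixed = ImageOps.autocontrast(img.convert("RGB"),
                                          cutoff=clip_percent)
            fixed.save(out_path)

No ambient light sensor required, and no human in the loop.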

Note that your average camera (or phone) has the ALS on one side of the device and the CCD/CMOS (or lens) on the other. Users also won't necessarily be experiencing the same lighting conditions as when they took the picture (it could be Paris, where we were for DAP, where rain comes and goes swiftly). The proper way to detect light conditions is to read information from the picture itself. Someone asked exactly that question recently [3] and got a couple of answers pointing to technology [4][5] that is on its way to solving it. There's some information available in Exif, some can be derived by computing the color balance, and, oddly enough, you can generally just ask the user what kind of lights they're using.
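
Reading that information out of the picture is not hard. Here's a rough sketch (untested; assumes Pillow; the tag numbers come from the Exif spec) that prefers the camera's own LightSource tag and falls back to a crude gray-world red/blue comparison when the tag is absent:

    from PIL import Image

    EXIF_IFD = 0x8769       # pointer to the Exif sub-IFD
    LIGHT_SOURCE = 0x9208   # Exif "LightSource" tag

    LIGHT_SOURCE_NAMES = {
        1: "Daylight", 2: "Fluorescent", 3: "Tungsten (incandescent)",
        4: "Flash", 9: "Fine weather", 10: "Cloudy", 11: "Shade",
    }

    def guess_illuminant(path):
        """Prefer the camera's Exif hint; fall back to a gray-world guess."""
        with Image.open(path) as img:
            exif = img.getexif().get_ifd(EXIF_IFD)
            if LIGHT_SOURCE in exif:
                return LIGHT_SOURCE_NAMES.get(exif[LIGHT_SOURCE], "Unknown")

            # Gray-world fallback: scenes average out to neutral grey, so a
            # red-heavy mean suggests warm (tungsten-ish) light and a
            # blue-heavy mean suggests daylight or shade.
            small = img.convert("RGB").resize((64, 64))
            pixels = list(small.getdata())
            r = sum(p[0] for p in pixels) / len(pixels)
            b = sum(p[2] for p in pixels) / len(pixels)
            return "Warm / indoor" if r > 1.15 * b else "Cool / daylight"

The papers in [4][5] do this far better, but even the crude version uses the light that actually hit the sensor, which the ALS cannot promise.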

Now, it might be reasonable for the camera itself to embed ALS information into its photos; that could save some effort, and perhaps that's reasonable feedback to provide to camera/photo vendors. But the information can also be incredibly misleading. If I'm on one side of a glass window (inside a train, or up the Eiffel Tower) shooting the other side, then the ALS on my cell phone will report the interior while the CCD/CMOS sees the exterior, and the information collected will be downright confusing.
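
If a vendor did want to record the ALS reading, Exif already has room for it. A sketch (untested; assumes the piexif library; stuffing the value into UserComment is my own illustration, nothing standardised) of stashing a lux value alongside the photo:

    import piexif
    import piexif.helper

    def embed_als_reading(jpeg_path, lux):
        """Write the ambient light sensor reading into Exif UserComment."""
        exif_dict = piexif.load(jpeg_path)
        comment = piexif.helper.UserComment.dump("ALS: %.1f lux" % lux)
        exif_dict["Exif"][piexif.ExifIFD.UserComment] = comment
        piexif.insert(piexif.dump(exif_dict), jpeg_path)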

In short, I think the right thing here is for additional information to be recorded by the camera into the Exif data. Yes, that means the user will need some way to strip this information (e.g. for privacy reasons), but that problem already exists.
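
The stripping side is already essentially a one-liner in existing tool chains; e.g. with piexif (same caveat: a sketch, not a recommendation):

    import piexif

    # Remove all Exif metadata (including any light-source hints) in place.
    piexif.remove("photo.jpg")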

The only use I've actually seen for the ALS in a standalone application was an application that functioned as a light meter and tried to tell me what kind of light source was above my device. That wasn't compelling, although I was willing to try it for a bit. Note that I could have told my device the answer, and a simple UI would have let me do a better job. In fact, your average camera *has* simple UIs for this purpose (even basic cameras): you can generally select "indoor", "outdoor", or "cloudy". And users are more or less used to being given such options (camera UIs are not necessarily wonderful in this area, but the options are at least familiar).
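
To show how little that UI needs behind it, here's a sketch (untested; assumes Pillow; the per-channel gains are illustrative guesses, not calibrated values) mapping the familiar presets to a simple white-balance tweak:

    from PIL import Image

    # Rough per-channel (R, G, B) gains for the usual camera presets.
    PRESETS = {
        "outdoor":     (1.00, 1.00, 1.00),   # daylight: leave as shot
        "cloudy":      (1.08, 1.00, 0.95),   # warm it up slightly
        "indoor":      (0.85, 0.95, 1.25),   # cool down a tungsten cast
        "fluorescent": (1.05, 0.90, 1.05),   # pull back the green spike
    }

    def apply_preset(path, out_path, preset):
        gains = PRESETS[preset]
        with Image.open(path) as img:
            r, g, b = img.convert("RGB").split()
            channels = [ch.point(lambda v, k=k: min(255, int(v * k)))
                        for ch, k in zip((r, g, b), gains)]
            Image.merge("RGB", channels).save(out_path)

    # apply_preset("lawn.jpg", "lawn_fixed.jpg", "outdoor")

A drop-down with four entries does the job the ALS is being proposed for, and the user can correct it when the sensor would have guessed wrong.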


[1] http://en.wikipedia.org/wiki/Transflective_liquid_crystal_display

[2] http://www.creativepro.com/article/out-of-gamut-don-t-underestimate-photoshop-s-auto-color

[3] http://photo.stackexchange.com/questions/12923/is-there-software-which-can-identify-the-lighting-in-an-existing-photo

[4] http://www.ri.cmu.edu/pub_files/2009/10/iccv09.pdf

[5] http://www.cs.sunysb.edu/~ial/content/papers/2011/panagop_cvpr11.pdf



Received on Tuesday, 26 July 2011 22:28:07 UTC