- From: Bryan Sullivan <blsaws@gmail.com>
- Date: Wed, 3 Aug 2011 22:34:51 -0700
- To: Josh Soref <jsoref@rim.com>
- Cc: "Tran, Dzung D" <dzung.d.tran@intel.com>, "public-device-apis@w3.org" <public-device-apis@w3.org>
Josh, Dzung's use cases make perfect sense to me. The fact that some devices (OK, some "well-designed" ones) may support the system features you describe does not mean that most devices do, or that even on devices that do, the use case of screen-brightness adjustment is irrelevant. I can think of various photo-viewing/editing use cases that might benefit from such controls, without requiring the user to go into a dark room or a bright room to cause the device to change its brightness (especially since the main point is to affect the brightness, contrast, or other display aspects relative to the ambient light).

We're not just talking about phones here either. Here are some other use cases for ambient light sensors used by webapps to control other devices or provide information about the user's environment:

- Ambient light sensors distributed around the house are monitored by a webapp, which displays the status of lights so that, prior to going to bed, the homeowner can turn off those that shouldn't be on, and turn on those that should be.

- An ambient light sensor is used by a webapp that can auto-activate an alarm system or take other automated actions when it gets dark, avoiding the need to manually adjust schedules for the time of year.

- An ambient light sensor that can pick up a motion-sensitive outdoor light is used by a webapp to trigger activation of a camera when the light goes on in the middle of the night. (Note this is a great use case for an API to the motion sensor itself, but that is not currently proposed, and the ability to trigger events from different types of sensors can overcome limitations in hardware or available APIs.)

Bryan

On Jul 26, 2011, at 3:27 PM, Josh Soref wrote:

> Tran wrote:
>> Incoming Call at the Theater
>>
>> Before the show began, the user has placed her cell phone in
>> vibrate/silent mode to be polite to the other audience members.
>> However, she needs to screen calls as they come in because, if the call
>> is from the emergency room, she needs to leave the performance and take
>> the call. As calls come in, the device senses that it is in a dark
>> location, turns the backlight down as low as possible, and adjusts the
>> color scheme to avoid bright colors. These adjustments mean that the
>> user's eyes don't need to adjust; they also make the cell phone's LCD
>> less distracting to other audience members.
>
> I don't consider this use case compelling. The system should (and does!) do this automatically for *all* applications. The user wants the behavior everywhere, not just in the rare application written to take advantage of some obscure API.
>
>> Direct Sunlight or Outdoor Use
>>
>> It's almost spring in the Northern Hemisphere, and the university lawns
>> are packed with wishful thinkers trying to focus on homework while
>> enjoying the warmth. The ambient light sensor detects that the sun is
>> shining, and the application uses a color scheme with more contrast so
>> that users can see more clearly in daylight.
>
> Transflective displays [1] have been available since at least 2008.
>
>> Photo Editing Application adjusts display colors based on the color of
>> the ambient light
>>
>> The user has just taken some photos and wants to upload them to a
>> content sharing site on the internet. He navigates to the content
>> sharing site's photo touch-up page, which uses the ambient light sensor
>> to detect that the user is in a space with fluorescent lights and
>> adjusts the user's colors so that the picture colors will look correct
>> on the majority of users' displays.
>
> This use case is somewhat contrived, but at least slightly less unreasonable than the previous one. However, even a normal person will understand that if you want decent color, you're going to want to be in a place where you can see what you're doing.
> Auto color correction doesn't actually need a person to review it; but if it did, the person would actually need to be able to see the colors clearly, and normal people understand that means going somewhere the sun isn't washing out the colors. If you try to do better than average, e.g. with Photoshop [2], then you're definitely going to want to see what you're looking at.
>
> Note that your average camera (or phone) has the ALS on one side of the device and the CCD/CMOS (or lens) on the other. Users are also not necessarily going to be experiencing the same lighting conditions as they had when they took their picture (it could be Paris, where we were for DAP, where rain comes and goes swiftly). The proper way to detect light conditions is to read information from the picture itself. Someone has already asked this question recently [3] and got a couple of answers pointing to technology [4][5] which is on its way to solving it. There's some information available in Exif, and some information can be calculated by computing the color balance. Or you can generally just ask the user what kind of lights they're using.
>
> Now, it might be reasonable for the camera itself to embed ALS information into its photos; that could save some effort, and perhaps that's reasonable feedback to provide to camera/photo vendors. But the information can also be incredibly misleading. If I'm on one side of a glass window (inside a train or the Eiffel Tower) and I'm shooting the other side, then the ALS on my cell phone will tell you about the interior, whereas the CCD/CMOS will see the exterior, and the information collected will be downright confusing.
>
> In short, I think the right thing here is for additional information to be recorded by the camera into Exif data. Yes, that means the user will have to have some way to strip this information (e.g. for privacy reasons), but that problem already exists.
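[The Exif route described above can be sketched concretely: the Exif LightSource tag (0x9208) is an enumerated value, and a touch-up page that obtained it could map it to a default white-balance hint while still letting the user override it. A minimal JavaScript sketch; the tag codes below are the published Exif enumeration (abridged), and `suggestWhiteBalance` is a hypothetical helper name, not an API from the thread:]

```javascript
// Exif LightSource tag (0x9208) codes, per the Exif specification (abridged).
const EXIF_LIGHT_SOURCE = {
  0: "unknown",
  1: "daylight",
  2: "fluorescent",
  3: "tungsten",   // incandescent
  4: "flash",
  9: "fine weather",
  10: "cloudy",
  11: "shade",
  255: "other",
};

// Hypothetical helper: map the code to a white-balance hint the page could
// apply as a default. Falls back to "unknown" so the UI can prompt the user
// instead of guessing (the "just ask the user" point above).
function suggestWhiteBalance(lightSourceCode) {
  return EXIF_LIGHT_SOURCE[lightSourceCode] ?? "unknown";
}
```

[E.g. a photo tagged with code 2 would yield the "fluorescent" hint from the use case; an unrecognized code yields "unknown" and the page would ask the user.]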
> The only use I've actually seen for the ALS in a standalone application was one that functioned as a light meter and tried to tell me what kind of light source was above my device. That wasn't compelling, although I was willing to try it for a bit. Note that I could have told my device the answer, and a simple UI would let me do a better job. In fact, your average camera *has* simple UIs for this purpose (even basic cameras): you can generally select "indoor", "outdoor", or "cloudy". And users are more or less used to being given such options (although camera UIs are not necessarily wonderful in this area, the options are at least familiar).
>
> [1] http://en.wikipedia.org/wiki/Transflective_liquid_crystal_display
> [2] http://www.creativepro.com/article/out-of-gamut-don-t-underestimate-photoshop-s-auto-color
> [3] http://photo.stackexchange.com/questions/12923/is-there-software-which-can-identify-the-lighting-in-an-existing-photo
> [4] http://www.ri.cmu.edu/pub_files/2009/10/iccv09.pdf
> [5] http://www.cs.sunysb.edu/~ial/content/papers/2011/panagop_cvpr11.pdf
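[For concreteness, the adjustment logic in the theater and sunlight use cases debated above reduces to mapping an ambient-light reading (in lux) to a display policy. A minimal JavaScript sketch; the thresholds and the `pickDisplayPolicy` name are illustrative, and the event wiring is shown only as a comment since no ALS web API was standardized at the time of this thread:]

```javascript
// Illustrative lux thresholds (rough, not normative): a dark theater is
// well under 50 lux, a typical interior a few hundred, direct sunlight
// tens of thousands.
function pickDisplayPolicy(lux) {
  if (lux < 50) {
    return { backlight: "minimum", theme: "dark" };          // theater use case
  }
  if (lux < 10000) {
    return { backlight: "auto", theme: "normal" };
  }
  return { backlight: "maximum", theme: "high-contrast" };   // outdoor use case
}

// Hypothetical wiring: an event-based ALS API of the kind under discussion
// might deliver readings like this (not a real API as of this thread):
// window.addEventListener("devicelight", (e) => {
//   applyPolicy(pickDisplayPolicy(e.value));
// });
```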
Received on Thursday, 4 August 2011 05:35:23 UTC