- From: Nilsson, Claes1 <Claes1.Nilsson@sonyericsson.com>
- Date: Fri, 29 Jan 2010 15:08:56 +0100
- To: 'Max Froumentin' <maxfro@opera.com>
- CC: 'Robin Berjon' <robin@robineko.com>, "public-device-apis@w3.org" <public-device-apis@w3.org>
Hi Max,

See my comments below.

Regards,
Claes

> -----Original Message-----
> From: Max Froumentin [mailto:maxfro@opera.com]
> Sent: Wednesday, 27 January 2010 11:43
> To: Nilsson, Claes1
> Cc: 'Robin Berjon'; public-device-apis@w3.org
> Subject: Re: Publishing System Information API FPWD
>
> Hi Claes, thanks for the feedback. See below for responses.
>
> On 27/01/2010 09:34, Nilsson, Claes1 wrote:
>
> > Section 4.7 Network:
> >
> > The value range for the currentSignalStrength attribute is
> > undefined. Should it be a number between 0 and 1 representing a
> > "minimum" and "maximum" signal strength level?
>
> True. Fixed.
>
> > Section 4.8 Sensors:
> >
> > - AmbientLight: The value range for the "intensity" attribute is
> > undefined. Should it be a number between 0 and 1 representing a
> > "minimum" and "maximum" intensity level?
>
> I sort-of agree. But then shouldn't we make all the sensors return a
> normalised value, for the sake of uniformity? That wouldn't work so
> well for the atmospheric pressure or ambient temperature, I think.

But currently there is no value range specified at all.

> > - Proximity: How does a "proximity" sensor work? If you hold the
> > device in your hand, the object nearest to the device is your hand.
> > If you put the device on a table, the nearest object is the table.
> > If you have the device in your pocket, the nearest object is the
> > pocket. Which are the use cases? How do we separate any other
> > object from the object that "holds" the device?
>
> I don't really know, and I haven't done much research on proximity
> sensors. Expert answers welcome!

Proximity is tricky. It is probably difficult to achieve "the distance
from the device to the nearest object, as determined by this device's
proximity sensors, in meters (m)". An object is always close to the
device if we assume that it is not "flying in the air" :-) It is
difficult, but might be possible, to distinguish the user's hand or
pocket from any other object.
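[Editor's note: to illustrate the 0-to-1 range proposed above for
currentSignalStrength (and similarly for intensity), a minimal sketch of
clamp-and-scale normalisation. The dBm bounds are illustrative
assumptions, not values from the draft:]

```javascript
// Map a raw RSSI reading (dBm) onto the proposed 0..1 range.
// The -113/-51 dBm bounds are illustrative (GSM-style), not from the draft.
const MIN_DBM = -113;
const MAX_DBM = -51;

function normalizeSignal(dbm) {
  const scaled = (dbm - MIN_DBM) / (MAX_DBM - MIN_DBM);
  return Math.min(1, Math.max(0, scaled)); // clamp to [0, 1]
}

console.log(normalizeSignal(-113)); // weakest -> 0
console.log(normalizeSignal(-51));  // strongest -> 1
console.log(normalizeSignal(-82));  // mid-range -> 0.5
```

Clamping matters because real readings can fall outside any nominal
bounds; without it, callers would see values below 0 or above 1.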
The question is what do we want to achieve? Which are the use cases?
There are many parameters to consider, e.g.: Where is the sensor
situated? In which direction does the sensor operate? Which type of
sensor, and what granularity of values? Etc. I will discuss this
internally with experts at SEMC and come back, but your view on the use
cases we want to support would be valuable.

> > - Robin states below that a W3C specification for FPWD should be
> > "reasonably feature-complete". I am considering this.
> >
> > For sensors we have, as I understand it, a resolution that "sensors
> > that are often used by web applications should be easy to access by
> > web developers" and be covered by the System Information API
> > specification. Sensors that are more seldom accessed should be
> > covered by a "full generic sensor API" that is subject to future
> > work. OK. Looking at section 4.8, it covers "values of external
> > sensors, reflecting the device's environment". The selection of
> > sensors is consistent considering that they should represent the
> > "device's environment". However, looking at this list with the view
> > "sensors that are often used by web applications", the priority
> > might be different. I would say that common training use cases
> > would motivate a "Heart rate sensor", a "Step counter sensor", and
> > maybe a "Blood pressure sensor". So, is the door open for any
> > additional sensors, e.g. a set of "values of external sensors,
> > reflecting the user's condition", given the view "often used by web
> > applications", or is the door closed?
>
> I for one consider that the door is open, and I'm hoping for more
> input on what sensors we should support. I don't think we would break
> the "feature-completeness" of the specification if we had that
> discussion after the FPWD is published and came to the conclusion
> that more sensors were needed.

I will come back with proposals.

> > Section 4.10 Storage:
> >
> > - The constants for type to be considered.
> > Wouldn't it be interesting to separate built-in RAM and a memory
> > card?
>
> Isn't that distinction properly captured through the isRemovable
> attribute?

Yes, you are right.

> > - Why different data types for "capacity" (unsigned long) and
> > "availableCapacity" (int)?
>
> Fixed, thanks. (Aside: I'm not really sure why WebIDL defines more
> than one integer type, except for unsigned, and, as a consequence,
> which one to use.)
>
> > Section 4.12 Input Devices
> >
> > - Editorial error for attribute "microphones []" as it says "The
> > list of cameras attached".
>
> Fixed.
>
> > - Editorial error for the Camera property, attribute
> > "maxZoomFactor", as it says "...must be null is hasPhysicalZoom is
> > false".
>
> Changed to "<dd>The maximum zoom factor of this camera. This value
> MUST be <code>null</code> if the camera does not have a zoom (whether
> physical or digital)</dd>".
>
> > Section A.2 Use Cases:
> >
> > - I miss a number of use cases. Some obvious use cases are for
> > example:
> [...]
>
> I've not touched that section recently, as I'm not sure it should be
> in the document. Use cases (and requirements) are aids, meant for
> writing the specification itself, but not part of the specification.
> Examples are OK, as they help the implementer get a quick overview of
> a specific API. That's why I have one at the beginning of each
> section. But use cases are just as well replaced by more generic
> wording saying "this specification enables webapps to do XXX, YYY and
> ZZZ". I'd welcome a WG decision on this, so I've opened ISSUE-70
>
> http://www.w3.org/2009/dap/track/issues/70
>
> Max.
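[Editor's note: on the storage exchange above, a minimal sketch of why
isRemovable makes a separate "memory card" type constant unnecessary. The
unit records are mock data; the property names (isRemovable, capacity,
availableCapacity) follow the draft as discussed, but treat the exact
shape as an assumption:]

```javascript
// Mock storage units such as the draft's Storage property might report.
const units = [
  { type: "HARDDISK", isRemovable: false, capacity: 8000000000, availableCapacity: 2500000000 },
  { type: "HARDDISK", isRemovable: true,  capacity: 4000000000, availableCapacity: 3900000000 },
];

// Built-in memory vs. memory card can be told apart by isRemovable alone:
function removableFree(storageUnits) {
  return storageUnits
    .filter((u) => u.isRemovable)
    .reduce((sum, u) => sum + u.availableCapacity, 0);
}

console.log(removableFree(units)); // free bytes on removable media
```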
Received on Friday, 29 January 2010 14:09:31 UTC