- From: Rigo Wenning <rigo@w3.org>
- Date: Sun, 09 Jun 2013 05:32:58 +0200
- To: public-privacy@w3.org, Frederick.Hirsch@nokia.com
- Cc: Nicholas Doty <npdoty@w3.org>, public-device-apis@w3.org
Hi Frederick,

I have one further remark on the proximity privacy considerations. For the
moment, the fingerprinting effect is mostly put forward. IMHO, an even
bigger risk of proximity events is that the location of the object in
proximity is known, and thus the application or service may derive location
information from the presence of the object.

This is sensitive in several directions. On the one hand, it may be a
privacy risk for the user that he is proven to be in a certain location,
and we should alert application developers that sharing this location
information may be very problematic. On the other hand, the user's presence
in some location may be a significant defense if he is accused of things
that happened at the same time in a different location.

As usual, the remedy is to put things under user control. This means that
it is IMHO not an issue to store the location information of the proximity
event on the device itself. But if location sharing is desired, the usual
precautions should be taken, namely user agreement and some level of
indication of sharing in the user interface.

--Rigo

On Sunday 19 May 2013 16:01:36 Nicholas Doty wrote:
> Thanks, Frederick, for drafting this text and turning the Privacy
> Interest Group comments and questions into coherent specification text.
> I had a few comments, which I've included inline below. At a high level,
> I think my questions are:
>
> * should this be entirely informative, or is there a place for normative
>   text?
> * should the specification describe any concept of "personal
>   information"?
> * which considerations are intended for implementers vs. application
>   developers?
> * is ranking of different fingerprinting techniques useful?
>
> CCing public-privacy both because they might want to see the specific
> results of a privacy review and because they may have insight into these
> questions.
>
> On May 9, 2013, at 11:22 AM, Frederick.Hirsch@nokia.com wrote:
> > I have drafted proposed text to add to the currently empty Security
> > and Privacy Considerations section of the Proximity Events [1] and
> > Ambient Light Events [2] specifications.
> >
> > This proposal is based on feedback from the Privacy Interest Group
> > (PING) [3], [4], [5].
> >
> > The following proposed text is common to both specifications, apart
> > from the part marked [SPECIFIC], which should be replaced with the
> > specific text that follows.
> >
> > Proposed common text:
> > ---
> >
> > 4. Security and Privacy Considerations
> >
> > This section is informative.
> >
> > This specification does not process or link to personal information.
>
> I'm not sure the concept of "personal information" is well defined, and
> in this case I'm not sure how useful it is. During the review it was
> occasionally speculated that there may be contexts where the ambient
> light or proximity reveals information about a person that they might
> want to keep private -- does that make the information "personal"?
> Furthermore, I would suggest that such a sentence is not particularly
> useful to the reading audience.
>
> > Privacy threats can arise when this specification is used in
> > combination with other functionality or when used over time,
> > specifically with the risk of correlation of data and user
> > identification through fingerprinting. Application developers should
> > consider how this information might be correlated with other
> > information and the privacy risks that might create. The potential
> > risks of collection of such data over a longer period of time should
> > also be considered.
> By "application developers", do you mean implementers (like browser
> vendors) of this API? Or JavaScript web app authors who will make use of
> the API? I think the question of correlation risk over time is one for
> implementers -- application developers might take advantage of that
> correlation, but it wouldn't be an inadvertent outcome that they would
> need to consider for downstream use, I don't think.
>
> > [SPECIFIC]
> >
> > If the same JavaScript code using the API can be used simultaneously
> > in different window contexts on the same device, it may be possible
> > for that code to correlate the user across those two contexts,
> > creating a new kind of tracking 'bug'.
>
> Rather than 'bug', I would say "a new kind of unexpected correlation,
> which could be used for tracking a user's activity". Furthermore, this
> seems like an area where we might usefully add normative text:
>
> Implementations SHOULD NOT fire [ambient light / proximity] events in
> multiple browsing contexts. For example, a mobile device might only fire
> proximity change events for the active "tab".
>
> Normative text might be appropriate here because this would standardize
> a mitigation of the privacy threat for all implementations and give
> consumers of the API clarity that they shouldn't expect background tabs
> to receive such events.
>
> > Implementations should consider providing the user an indication of
> > when the sensor is used and allowing the user to disable sensing.
> >
> > Application developers that use this specification should perform a
> > privacy assessment of their application, taking all aspects of their
> > application into consideration.
>
> While I think this is sound advice for application developers generally,
> is it productive to add to this section of this specification?
>
> > ---
> >
> > [SPECIFIC] to be replaced with the following for Proximity Events:
> >
> > Variations in implementation limits of minimum and maximum sensing
> > distance, as well as event firing rates, offer the possibility of
> > fingerprinting to identify users, although this threat is relatively
> > low considering the availability of other, simpler fingerprinting
> > possibilities. Implementations may reduce the risk by limiting the
> > granularity and event rates.
>
> Is it important to describe the fingerprinting threat as low because of
> other fingerprinting possibilities? I am particularly concerned because
> if every specification includes this caveat, it could give the
> impression that no fingerprinting mitigations are ever worth pursuing.
> (There are some who have advocated for accepting that outcome, but it
> seems to be an open and changing question.) If some of the simpler
> fingerprinting techniques were mitigated in other specs or common
> browser implementations, would we need to update this spec to note that
> this fingerprinting technique is now a relatively large threat? I'm also
> not sure that the judgment of the relative threat of fingerprinting in
> implementing this specification is important to the implementing
> audience.
>
> Thanks,
> Nick
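To make the multiple-browsing-contexts point above concrete, here is a
minimal JavaScript sketch of a page consuming the 'deviceproximity' event
and the value/min/max attributes from the Proximity Events draft; the
document.hidden check (Page Visibility API) is an application-side
assumption added for illustration, not something either draft requires:

    // Sketch only: handle proximity readings, skipping them while this
    // browsing context is hidden. 'deviceproximity' and value/min/max come
    // from the Proximity Events draft; the document.hidden check is an
    // illustrative assumption, not part of the draft.
    function handleProximity(event) {
      if (document.hidden) {
        return; // background tab: do not process or share this reading
      }
      console.log('distance (cm):', event.value,
                  'sensor limits (cm):', event.min, '-', event.max);
    }
    window.addEventListener('deviceproximity', handleProximity, false);

Under the implementation-side mitigation Nick proposes, such a visibility
check would become unnecessary, since background contexts would simply not
receive the events in the first place.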
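On the fingerprinting passage, the surface under discussion is roughly the
tuple of reported minimum, maximum, and observed firing rate, which varies
with hardware and implementation. A hypothetical sketch of what limiting
granularity and event rate could look like on the consuming side (the 5 cm
step and 200 ms interval are arbitrary values chosen only for illustration):

    // Sketch only: quantise reported distances and throttle delivery, so
    // the exact sensor limits and firing rate reveal less about the device.
    // STEP_CM and MIN_INTERVAL_MS are assumed values, not from the draft.
    var STEP_CM = 5;
    var MIN_INTERVAL_MS = 200;
    var lastDelivery = 0;

    window.addEventListener('deviceproximity', function (event) {
      var now = Date.now();
      if (now - lastDelivery < MIN_INTERVAL_MS) {
        return; // drop readings arriving faster than the chosen rate
      }
      lastDelivery = now;
      var coarseValue = Math.round(event.value / STEP_CM) * STEP_CM;
      // share or log coarseValue rather than the raw event.value
    }, false);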
Received on Sunday, 9 June 2013 03:33:30 UTC