- From: Rob van Eijk <rob@blaeu.com>
- Date: Thu, 11 Oct 2012 23:50:21 +0200
- To: <public-tracking@w3.org>
Let's talk more about harm (my def := a violation of the user's privacy) in the context of the permitted uses, together with the safeguards (no secondary use, data minimization and transparency, reasonable security, no personalization) that reduce the impact of the permitted uses on the user's privacy. Shane already addressed proportionality. We have talked about subsidiarity, i.e. browser-based privacy-friendly alternatives, at length before, so I will not touch on that now.

We also heard that in the EU, storing/reading unique identifiers to/from the browser without explicit consent is a violation of the user's privacy, and we talked about the misuse of the term "anonymous data". In my view, unique identifiers that can be used to follow user behavior across sites MUST be treated as personal data, and a company MUST make representations accordingly.

To me, harm can be approached the same way as risk: mitigating harm means reducing the chance of harm and/or reducing the impact on the user's privacy. Some elements that IMHO lead to harm, and that I would like to bring to the table, are:

- de-contextualisation: data about a user's behavior collected in one context MUST NOT be applied in other contexts;
- information imbalance: it MUST be disclosed to the user which specific categories of (online/offline/inferred) data were used to display a specific ad;
- opaque monetisation and customer value: it is often not clear what the business proposition is, let alone what value the user represents. This SHOULD be part of the informed choice the user gets;
- data retention: data minimization MUST include limits on data retention. Also, if a compare-and-forget alternative works for a permitted use, that solution MUST be the preferred choice for implementation (see the sketch after my signature).

Rob
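PS: To make compare-and-forget concrete, here is a minimal Python sketch, assuming a frequency-capping permitted use. The class name, the cap of 3, and the 24-hour retention window are illustrative assumptions of mine, not anything from the draft text. The point is only that the raw identifier is compared and then forgotten: nothing but a salted hash is kept, and rotating the salt makes old hashes unlinkable.

    import hashlib
    import os
    import time

    RETENTION_SECONDS = 24 * 60 * 60  # assumed retention window: one day

    class CompareAndForget:
        """Frequency capping without retaining the raw identifier."""

        def __init__(self):
            self._salt = os.urandom(16)
            self._salt_created = time.time()
            self._seen = {}  # salted hash -> impression count

        def _rotate_salt_if_expired(self):
            # Dropping the salt makes every stored hash unlinkable,
            # which is the "forget" half of compare-and-forget.
            if time.time() - self._salt_created > RETENTION_SECONDS:
                self._salt = os.urandom(16)
                self._salt_created = time.time()
                self._seen.clear()

        def under_cap(self, unique_id: str, cap: int = 3) -> bool:
            # "Compare": check the salted hash against what we have
            # seen; the raw identifier itself is never written anywhere.
            self._rotate_salt_if_expired()
            digest = hashlib.sha256(self._salt + unique_id.encode()).hexdigest()
            count = self._seen.get(digest, 0)
            if count >= cap:
                return False
            self._seen[digest] = count + 1
            return True

A caller would simply ask under_cap(browser_id) before serving an ad; after the retention window, no stored value can be linked back to any browser.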
Received on Thursday, 11 October 2012 21:50:51 UTC