RE: Third-Party Web Tracking: Policy and Technology Paper outlining harms of tracking

Rob,

With all due respect, you've defined the term of focus here ("harm") in ambiguous terms that are themselves not well-defined ("violation of user privacy") - so you've not really advanced the discussion.

To your other points/tests:

- de-contextualisation: already achieved in the draft text, as data would not be used outside of the Permitted Uses or to modify a user's experience.

- information dis-balance: working towards this goal with AdChoices Metadata (phase 2 of that project), which will carry more information about the ad, and those involved in its targeting, along with the ad itself. Not something I believe DNT resolves, but industry is working towards a solid solution here.

- intransparent monetisation and customer value: I believe this is already evident, as much research has shown that users understand they are getting ads because they're consuming free content. The deeper area of contention is how much detail a user must understand - what is the threshold?

- data retention: data minimization is already included in the draft text.

So far I believe we're on track to meet your tests with the most recent draft.

- Shane

-----Original Message-----
From: Rob van Eijk [mailto:rob@blaeu.com] 
Sent: Thursday, October 11, 2012 2:50 PM
To: public-tracking@w3.org
Subject: Re: Third-Party Web Tracking: Policy and Technology Paper outlining harms of tracking

Let's talk more about harm (my_def:= violation of the user's privacy) in the context of the permitted uses, together with safeguards (no secondary use, data minimization and transparency, reasonable security, no personalization) to reduce the impact of the permitted uses on the privacy of the user. Shane already addressed proportionality. We talked about subsidiarity, as in browser-based privacy-friendly alternatives, at length before, so I will not touch on those now. We also heard that in the EU, storing/reading unique identifiers to/from the browser without explicit consent is a violation of the user's privacy. We also talked about the misuse of the term anonymous data. In my view, unique identifiers that can be used to follow user behavior across sites MUST be treated as personal data, and a company MUST make representations accordingly.

To me harm can be approached the same as risk. So mitigating harm is based on reducing the chance of harm and/or reducing the impact on the user's privacy.

Some elements that IMHO lead to harm that I would like to bring to the table are:

- de-contextualisation: data about the behavior of a user collected in one context MUST NOT be applied in other contexts;
- information dis-balance: it MUST be disclosed to the user which specific categories of (online/offline/inferred) data were used to display a specific ad;
- intransparent monetisation and customer value: it is often not clear what the business proposition is, let alone what value the user represents. This SHOULD be part of the informed choice the user gets;
- data retention: data minimization MUST include data retention. Also, if a compare-and-forget alternative works for a permitted use, that solution MUST be the preferred choice for implementation.

Rob

Received on Thursday, 11 October 2012 23:36:04 UTC