- From: Nicholas Doty <npdoty@w3.org>
- Date: Sat, 11 Feb 2012 18:39:26 -0800
- To: Shane Wiley <wileys@yahoo-inc.com>
- Cc: David Singer <singer@apple.com>, JC Cannon <jccannon@microsoft.com>, Jonathan Mayer <jmayer@stanford.edu>, Ninja Marnau <nmarnau@datenschutzzentrum.de>, "Amy Colando (LCA)" <acolando@microsoft.com>, "Frank.Wagner@telekom.de" <Frank.Wagner@telekom.de>, "public-tracking@w3.org" <public-tracking@w3.org>
On Feb 9, 2012, at 1:04 PM, Shane Wiley wrote:

> I believe we're in agreement that we're not aiming at bad actors - but actually "all actors" with respect to DNT. Where I believe your response and my intended point diverge is that it appears at times that the Working Group is going out of its way to punish the bad actors when in reality good actors will be taking the brunt of this attack. If we "trust" good actors to be good actors (I believe Roy used similar language), then "use limitations" should be an acceptable starting point for the working group.

One more concurrence on considering both bad actors and good actors and recognizing that DNT will be most applicable for "good actors". By "good actors", I would mean organizations we believe are most likely to keep stated promises. As David points out in his department store example, there are many privacy concerns around data collection and retention, even if the actors don't violate promises not to sell/trade info on to other parties.

I thought we actually had agreement in the goals/Issue-5 discussion at Brussels that the only proposal that didn't have any support was the one that didn't impact collection/retention (i.e. Do Not Target). I suspect that use limitations alone are unlikely to be sufficient to address the substantial privacy concerns around online tracking.

On the other hand, if you're suggesting (via "starting point") that use limitations will be useful for the specification even though we'll also address collection/retention, then +1 and apologies for the digression.

Thanks,
Nick

> -----Original Message-----
> From: David Singer [mailto:singer@apple.com]
> Sent: Thursday, February 09, 2012 11:19 AM
> To: Shane Wiley
> Cc: JC Cannon; Jonathan Mayer; Ninja Marnau; Nicholas Doty; Amy Colando (LCA); Frank.Wagner@telekom.de; public-tracking@w3.org
> Subject: Re: [Issue-71] Proposed Text for Issue 71
>
> On Feb 8, 2012, at 22:40, Shane Wiley wrote:
>
>> David,
>>
>> In the examples you've provided:
>>
>> "I don't think it's only 'bad actors', alas. It is the very existence of the data that causes concern. What happens if it leaks? The management changes? Someone makes a mistake? Law authorities want to look at it? The company gets bought or merged? And so on."
>>
>> If an organization had retained data for DNT:1 events for specific operational purposes and then one of the voluntary events occurs (mgmt change, purchase/merger) such that the information is used outside of the DNT standard exceptions, then that organization is a "bad actor" -- and pursuant to the claims in their privacy policy at the time of data collection (or a response header), it would be my expectation that they felt the full force of the law in all jurisdictions they operated in.
>>
>> The security-oriented risks such as "leaks" and "someone makes a mistake" are real concerns, but when balanced against the real-life risks anonymous cross-site data collection presents, we need to be careful to ensure the level of compliance burden is proportionate. It is for this reason that "use based limitations" are the most appropriate outcome for this particular set of privacy issues.
>>
>> I understand the desire for absolutist remedies (radically short retention periods, outright data destruction, etc.), but the cost to implement these combined with the impact to business continuity will be too great to have many organizations wish to implement DNT.
>>
>> - Shane
>
> Shane
>
> thanks for the thoughtful response.
> I think we're in agreement; we need a balanced specification that makes real changes in terms of the privacy of consumers, yet is implementable by the business community. (If no-one implements, we'll have had no effect on privacy at all :-().
>
> I just want to lay to rest the idea that we're protecting against 'bad actors', and that's the only concern. In some sense, that's not the concern at all, since we can't (except by providing sharper instruments with which to confront them, which is not a small thing in itself).
>
> If you went to the department store to buy a shirt, and at the entrance someone took your picture and said "welcome back Mr. Wiley", and as you walked in someone said "that sweater you looked at last week is on sale", and further into the store someone said "would you like a pill-case for the prescription you just picked up down the road?", and then someone said "many people with your income and background very much like our new line of briefcases", and later "your daughter would really like this pink hair bow, it would go with the blouse your wife bought her in Paris", you'd be freaked. What's going on that you know all this about me?
>
> That 'what the heck' is only tangentially connected with 'bad actors'.
>
> David Singer
> Multimedia and Software Standards, Apple Inc.
Received on Sunday, 12 February 2012 02:39:39 UTC