- From: Haakon Bratsberg <haakonfb@opera.com>
- Date: Thu, 15 Mar 2012 23:55:25 +0100
- To: "Roy T. Fielding" <fielding@gbiv.com>
- Cc: Rigo Wenning <rigo@w3.org>, Tracking Protection Working Group WG <public-tracking@w3.org>
On Mar 15, 2012, at 8:37 PM, Roy T. Fielding wrote:

> On Mar 15, 2012, at 2:19 AM, Rigo Wenning wrote:
>
>> Roy, Jonathan, Shane,
>>
>> On Wednesday 14 March 2012 11:39:55 Shane Wiley wrote:
>>> Please understand these activities are to PROTECT users and businesses alike
>>> (depends on the attack). I'm hopeful we don't purposely create real risk
>>> of harm to users in our attempts to "lock down" the DNT standard.
>>
>> Security versus privacy is a classic issue in data protection. Our forefathers of
>> data protection in the seventies said that good data protection requires
>> more secure systems, to protect also against abuse of personal information, so
>> they tried to harmonize security and data protection.
>>
>> On the one hand, I have a lot of sympathy for Roy's warning not to open that can
>> of worms, and I would be very reluctant to include security-related provisions
>> in the two Specifications. On the other hand, I also have a lot of sympathy
>> for the suggestion to use the expertise present here to offer some privacy
>> suggestions to the fraud-fighters in the Web's payment channel.
>>
>> Because PROTECT is relative. I'm pretty sure that Assad claims to PROTECT
>> Syria, so merely saying "protect" as a use limitation doesn't save us
>> here. A best-practices document on fraud protection for ad companies would be
>> useful. It could flag unnecessary data collection and identify doubtful
>> sharing practices that would allow the data collected for fraud
>> protection to be abused. In a word: make fraud protection for the Web somewhat
>> smarter, privacy-wise. And I think that in a second generation we could have
>> a framework where a service agrees to back down a bit because the users have
>> decided (via DNT) not to be as highly secured, because they favor privacy in a
>> given context.
>
> That's all true, Rigo, but it has nothing to do with DNT. DNT does not exist
> to solve all privacy problems, and user preferences cannot solve security problems.
> There is a continual tug of war between security and privacy, yes, but it is
> not our war to resolve. As I said, I strongly encourage regulators to take this
> on directly, if they have not done so already, since it is not a matter that
> user preferences/consent can resolve. Fraud control is not about consent.
>
> Data retention for the sake of fraud control should be limited regardless of
> the DNT signal, because it should be assumed that the user has not consented
> (they are usually not given a choice). I don't know how it should be limited.
> I am certain that this working group is incapable of reaching consensus on
> how fraud control must be limited, since it depends on the nature of the
> fraud, the nature of what is being protected, and the nature of the
> organization doing the protection.
>
> In short, we have neither the time, nor the expertise, nor the authority to
> address this problem in general, other than to say that there exists an
> exemption for fraud control, and that data collection/retention/use under that
> exemption must be limited to what is necessary for that fraud control.

Couldn't agree more.

Haakon
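For illustration only, a minimal sketch (not text from the thread or from either specification) of the narrow carve-out Roy describes: a server that sees DNT: 1 keeps only the fields it needs for fraud control, and only for a bounded period. The record type, field names, and the two-week window below are all hypothetical.

    from dataclasses import dataclass, replace
    from datetime import timedelta
    from typing import Optional

    # Illustrative retention bound for fraud-control data; not a number from any spec.
    FRAUD_CONTROL_RETENTION = timedelta(weeks=2)

    @dataclass
    class RequestRecord:
        # Hypothetical fields a server might log per request.
        ip_hash: str                # pseudonymised network identifier, used for fraud scoring
        user_agent: str             # also used for fraud scoring
        tracking_id: Optional[str]  # cross-site identifier whose only use is ad targeting

    def record_for_storage(headers: dict[str, str], record: RequestRecord) -> RequestRecord:
        """Return the subset of `record` to retain, honouring DNT except for fraud control."""
        if headers.get("DNT") == "1":
            # Fraud-control exemption: keep only what is necessary for that purpose
            # and drop identifiers that exist solely for tracking.
            return replace(record, tracking_id=None)
        return record

FRAUD_CONTROL_RETENTION would then bound how long even the reduced record is kept; how long that should be is exactly the question this group can only answer in general terms.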
Received on Sunday, 18 March 2012 20:48:30 UTC