- From: Rigo Wenning <rigo@w3.org>
- Date: Wed, 20 Apr 2011 13:51:41 +0200
- To: public-privacy@w3.org
- Cc: jeanpierre.lerouzic@orange-ftgroup.com, karld@opera.com, david@remahl.se
On Wednesday 20 April 2011 09:36:21 jeanpierre.lerouzic@orange-ftgroup.com wrote:

> Karl> Do we have the tools to be forgotten?

Yes Karl, this is the right question for W3C. But in order to provide an answer, we have to agree on the meaning and the degree of "forgotten".

[...]

> Privacy enforcement tools that rely on OS DRM are (IMO) an opportunity for
> the browser industry.

Jean-Pierre, in the PrimeLife project we realized how close DRM and privacy really are. But we also realized that too much control kills the tool. So the idea is not to create a tool that is perfectly engineered not to leak at all. The idea is to create a tool that does what is socially expected. 100% control is socially expected in DRM, but not in all areas of privacy.

To be forgotten - and here it joins the discussion triggered by Karl and David Remahl - would not mean that the bits are erased, vanished via magnetic fields. It could simply mean that those bits are no longer publicly accessible, that they live only behind a specially guarded search engine. So "to be forgotten" poses a new challenge for a different type of access control.

As far as I can see from the discussion in Brussels, "to be forgotten" is mainly about the difficulty of leaving a social network (a very controlled environment) without leaving public traces. To what extent can private entities keep information against the will of their clients? This goes beyond the traditional borderlines of privacy IMHO and also needs new argumentation.

Best,

Rigo
Received on Wednesday, 20 April 2011 11:52:05 UTC