
Re: Action-101: Revise text for Issue-6, What are the underlying concerns?

From: Rigo Wenning <rigo@w3.org>
Date: Thu, 23 Feb 2012 22:48:44 +0100
To: Nicholas Doty <npdoty@w3.org>
Cc: public-tracking@w3.org, John Simpson <john@consumerwatchdog.org>, "Aleecia M. McDonald" <aleecia@aleecia.com>, Matthias Schunter <mts@zurich.ibm.com>, Thomas Roessler <tlr@w3.org>
Message-ID: <7558577.Crn5zz5xzB@hegel.sophia.w3.org>
Here is my proposed text: 

Privacy, as first articulated by Louis Brandeis in 1890, is the right to be let 
alone. But privacy is also the term used by Alan Westin in 1967, and there it 
is about large dossiers on people, held in computers and used to make decisions 
about those people. This triggered a discussion about the individual's autonomy 
of decision-making in a completely computerized society, a discussion that has 
been ongoing ever since. This Specification is very much a part of that 
discussion. 

The fear is that large profiles about people, identified or identifiable, may 
produce two phenomena: 

1/ Direct influence 

The government may gain access to such profiles and thereby obtain too much 
information about citizens. This information could then be used to steer 
opinion in a certain direction. But it could also be used to identify and 
target leading intellectuals and their networks.

2/ Self-constraint and self-censorship

Every one of us tries to maintain a certain image to the outside world. The 
success of social networks shows that many people care deeply about their 
image. But each of us also has sides that do not match the image we try to 
convey: a disease, perhaps, or a certain trait of character. If we do not know 
what others know about us, how can we shape our image? Many people assume that 
the outside world knows more than it really does. Once people realize what the 
big unknown may know about them, they start to worry. This leads to 
self-constrained behavior intended to avoid further collection: fewer searches, 
and searches only for non-sensitive things. A person with AIDS would no longer 
visit pages about AIDS. People would no longer look at controversial 
information and opinions. The fear itself hampers the forming of opinions. 
So it is not the profile itself that creates the harmful effect, but the 
imagination of what those profiles could contain and what they could be used 
for. With the addition of a few high-publicity cases and some circulating urban 
myths, e.g. accidentally ending up on a no-fly list, the fear establishes 
itself in large portions of society. And this has the potential to seriously 
hamper the opinion-forming that is so crucial for a democratic society. 

I hope this explains why I sometimes say: we kill our democracies by accident. 
I don't think the people creating those large profiles today are really aware 
of the psychological dangers they create. It's all only about advertisement, 
isn't it? It isn't, actually, so let's work to make it that again.

I think this text would also benefit from a review by a native English speaker.


On Saturday 11 February 2012 19:12:40 Nicholas Doty wrote:
> On Feb 11, 2012, at 12:57 PM, Rigo Wenning wrote:
> > 2/ While the consumer-protection aspect is clearly stated, the
> > protection of democracy aspect is not clear and is hidden in the "human
> > rights" statement.
> Rigo, do you want to suggest some text to explain the democratic concern?
> From my part, some text we came up with in one of our small groups in
> Brussels may be relevant to this enumeration/elaboration of privacy concerns:
> > * Experiencing targeting based on data about me from unexpected sources.
> >   (In many cases, large profiles of data about me or people like me
> >   already exist, compiled from either online or offline data.)
> > * Retention of browsing history data by unexpected sources.
> Thanks,
> Nick
Received on Thursday, 23 February 2012 21:49:12 UTC
