Re: Proposed Text for Local Law and Public Purpose

Hi Walter,

As always, I appreciate your thinking on this, but it seems we see 
things quite differently. My responses are inline.

On 10/24/12 6:27 AM, Walter van Holst wrote:
> On 10/24/12 4:13 AM, David Wainberg wrote:
>> On 10/23/12 8:34 PM, Dan Auerbach wrote:
>>> the onus is on me to explain privacy risks through examples
>> Please do. There continues to be reluctance to specify the exact risks
>> we're trying to address with this standard. It would be extremely useful
>> for us to finally enumerate the problems we're trying to solve so that
>> we can zero in on the appropriate solutions. Thanks!
> I am sorry David, but we're dealing here with fundamental human rights.
> To take a horrible historical analogy: there was much resistance to the
> abolition of slavery in the Southern US states for economic reasons.
You're comparing online advertising to the enslavement and brutal abuse 
of a race of people? With respect, I think hyperbole overshadows your 
point, and some may even find it offensive. Perhaps you can think of a 
more apt analogy.
> The
> mere fact that a billion dollar industry has sprung up around what is
> essentially stalking internet usage to the point that Orwell comes
> across as an optimist does not put a burden of proof on the side of
> those who advocate for privacy and freedom of expression to provide
> exact numbers. Dan was talking about giving examples, not about 'exact
> risks'.
Aside from whether this point is hyperbole as well, it is irrelevant. 
DNT as conceived by this working group will have little to no impact on 
any Orwellian data collection. First parties online, all sorts of 
parties offline, and more importantly, governments everywhere will 
continue to have the ability to collect large amounts of data about all 
of our behavior online and off. If that's the problem we're trying to 
solve, we're way off base. Constraining the uses that third parties are 
allowed to make of data collected online will give no net benefit to 
users in this regard.

Participants in this working group should stop using problems we are not 
solving as rationales. To my initial point, if you want to enumerate the 
problems we are trying to solve, so that we can direct our solutions to 
those, that might be a good route. For example, if access to data and 
misuse of it by governments is a particular issue, let's explore that. 
It's been mentioned regularly, but has never been approached as a 
discrete concern we are trying to address.
>
> Do not forget that a lack of privacy also erodes freedom of expression
> by putting up barriers to accessing information.
I'm not entirely sure I understand this point, but I think I see it 
exactly the opposite way. Third-party online advertising services make 
possible a much wider range of content and services than users would 
have access to otherwise. Without the diversity that third parties play 
an essential role in fostering, users' access to information would be 
more limited, not less. Distributed control, network effects, low 
barriers to entry -- these are the qualities that make the Internet such 
a democratic medium. Shouldn't we be sure this standard does not 
undermine these?
>
> An example of the kind of risk we're talking about is the prevalence of
> 'like' buttons on pages, including those of news sites. That allows
> Facebook to compile a list of all the news articles a Facebook user
> reads. I think a society in which entities not only know which
> newspapers you read, but also which articles, and can even measure how
> much time you've spent on each of them, is post-Orwellian.
Again, it appears the DNT we've conceived here will not have any 
substantial effect on this.
> This also brings me back to why I don't think an exception for third-party
> trackers that also happen to be first-party trackers is appropriate.
>
> It should be mentioned that there is a business interest in behavioural
> advertising, and that interest is even perfectly legitimate under sound
> consent conditions. I think this discussion is essentially fruitless
> because it is based on a fundamentally wrong assumption: that there is a
> legitimate business interest in processing data for which users have
> given an express opt-out signal. At the core we all (at least I hope)
> want a nimble standard that provides a mechanism for expressing consent
> or lack thereof.
>
This is not how it is in the US. Our law and our culture around these 
issues are different. Although we have had many conversations in this 
group about how we can try to craft DNT to suit needs in the EU, I'm not 
sure there's an appetite here to import European law to the US via this 
standard. This is the justification for my compliance token proposal: 
there are significant differences we may not be able to accommodate in a 
monolithic standard.

-David

Received on Wednesday, 24 October 2012 11:56:51 UTC