- From: David Wainberg <david@networkadvertising.org>
- Date: Sat, 13 Oct 2012 09:30:00 -0400
- To: Rigo Wenning <rigo@w3.org>
- CC: public-tracking@w3.org, Alan Chapell <achapell@chapellassociates.com>, Shane Wiley <wileys@yahoo-inc.com>, Vincent Toubiana <v.toubiana@free.fr>, Jeffrey Chester <jeff@democraticmedia.org>, Jonathan Mayer <jmayer@stanford.edu>
Hi Rigo,

This is a very interesting discussion, but still vague. Alan and Shane have made excellent points about the need to identify the particular harms we're trying to solve for. Without doing so, we're wandering around in the desert without direction. This is not, as you say, "like looking for security breaches." Security solves for a particular problem: unauthorized access to data. There is an understanding of the value or risk associated with the data to be protected, and the security is scaled proportionally. We are not in that situation here, even though we've asked many times for specifics. Vincent recently raised one specific case -- access to server logs in a civil legal proceeding -- which is very helpful for discussion. It can help us zero in on the specific problems that are applicable, and then focus on specific, reasonable solutions to those problems. I'm truly baffled by the reluctance to do more of this.

-David

On 10/12/12 5:12 PM, Rigo Wenning wrote:
> On Thursday 11 October 2012 16:27:06 Shane Wiley wrote:
>> * No harm ever came to users
>
> Can we please stop that silly discussion and go back to reality?
>
> Alan, it is clear that the concrete harm of profiles built by ad networks is very hard to determine in a world that is full of NDAs and settlements. And I agree that you need to know about the harms in order to determine the protective measures. So you have a point. But it is like looking for security breaches. I will still try (and this list is not exhaustive or in any way scientific or correct).
>
> The fact that the industry pays over 10 times more for targeted advertisement and profiles should be enough evidence that there is value. Money is an information system, after all. But this value is not neutral. The value is the ability of the industry to reduce the autonomy of consumers. Apart from annoying pop-ups and targeted spam that factor into the psychology of the marketplace, people find it really creepy that the "unknown" knows so much about them. Go read Foucault to assess the chilling effects of that process. Reducing autonomy, in concrete terms, means manipulation to sell goods at higher prices than otherwise possible.
>
> You look for a smoking gun? I have long been hesitant to provide it. And I still don't. But I can report from the hearing in the EU Parliament on the new data protection regulation, where two of the most respected advocates reported people's concerns that governments siphon off all the data and profiles that have been created. It is not advertisement as such; it is the profiles created and the targets identified. People are not as naive as some other people may want to believe. DNT is a way to say: look the other way and don't record for the spooks. They may still find something in your accounting data, but less than the full profile and not forever.
>
> A further psychological component adds to this. We say "do not track" and probably, for marketing reasons, cannot backpedal from this term. If someone selects "do not track" while tracking still goes on and just the creepy symptoms are suppressed, that's even worse and more unpredictable than doing nothing. A system has to be predictable and reliable. And if I say to the service "please look the other way" and they still look with one and a half eyes, I'm not really getting what I want. Disappointed expectations will add to the hostile environment the ad industry is currently working in.
> This is not the achievement we are looking for.
>
> Last but not least, there is not only concrete abuse, but also the abstract danger of large amounts of data. I have personal experience with this as legal counsel. Until 2003, W3C kept all logfiles for historical reasons (the thought was that we invented the Web and have to keep stuff for the historians). Then we were the target of a multitude of subpoenas that wanted to know who saw what when, to determine who was willfully infringing what patent (or to create an allegation thereof). And I finally convinced the Sys-Team to anonymize logs after 6 weeks. This helped. (We have a known script and policy for that.) Vincent tried to allude to this with the YouTube case. There can be many attempts to get your profile.
>
> Now Alan can ask me: but this is also true for first parties. And now I have to confess that I personally believe the distinction between first and third parties doesn't make much sense, neither in a dogmatic (legal) sense nor in risk-based thinking. I think the FTC found a settlement that made perfect sense for the concrete case but created an unfortunate precedent for the US market. HTTP just makes requests for elements and can't distinguish between first and third parties (apart from same origin). So a harms-based discussion will always hurt itself with this distinction. On the other hand, the TPWG has to accept some outside legal realities. The first/third distinction was brought in to reduce the scope of all this effort. Fine. For the EU system, the distinction is irrelevant because of statutes, so everybody is treated equally there.
>
> To conclude: if there were no harm and no social outcry, we wouldn't be sitting here spending our time on this. Alan, I also find it somewhat audacious to question the reality of the entire data protection circus and the entire body of research done in this space in the past 50 years. All a joke? But maybe the earth is flat and we didn't realize. That said, a constructive questioning of the concrete harms will bring us forward. But that requires that we come out of the trenches and accept that "potential" abuse exists. The discussion on harms should really now concentrate on the concrete permitted uses. Trying to bomb "marketing" into "permitted uses" in the presence of DNT:1 with the "no harm" argument doesn't help at all.
>
> So my question is: Alan, what data collection and use do you want to do that you can't? This is precisely Walter's question (and I may have the same cultural bias as Walter has, but please be indulgent with us on this aspect).
>
> Rigo
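For concreteness, here is a minimal sketch of what a "keep raw logs, anonymize after six weeks" policy like the one Rigo describes could look like. It is purely illustrative and is not the W3C script he mentions; the Apache combined log format, the six-week cutoff, and the choice to truncate IPv4 addresses to /24 and IPv6 addresses to /48 are all assumptions made for this example.

```python
#!/usr/bin/env python3
"""Illustrative sketch only -- not the W3C script referred to above.

Reads Apache combined-format access log lines from stdin and coarsens
the client IP on entries older than six weeks (IPv4 reduced to its /24,
IPv6 to its /48), writing the result to stdout.
"""
import ipaddress
import re
import sys
from datetime import datetime, timedelta, timezone

# Leading "client-ip ident user [timestamp]" portion of a combined log line.
LINE_RE = re.compile(r'^(\S+)(\s+\S+\s+\S+\s+\[)([^\]]+)(\].*)$')
CUTOFF = datetime.now(timezone.utc) - timedelta(weeks=6)


def anonymize_ip(raw: str) -> str:
    """Coarsen an IP address; leave non-IP tokens (e.g. '-') untouched."""
    try:
        ip = ipaddress.ip_address(raw)
    except ValueError:
        return raw
    prefix = 24 if ip.version == 4 else 48
    return str(ipaddress.ip_network(f"{ip}/{prefix}", strict=False).network_address)


def process(line: str) -> str:
    """Anonymize the client IP if the entry is older than the cutoff."""
    m = LINE_RE.match(line)
    if not m:
        return line
    host, middle, ts, rest = m.groups()
    try:
        when = datetime.strptime(ts, "%d/%b/%Y:%H:%M:%S %z")
    except ValueError:
        return line
    if when < CUTOFF:
        host = anonymize_ip(host)
    return f"{host}{middle}{ts}{rest}"


if __name__ == "__main__":
    for raw_line in sys.stdin:
        sys.stdout.write(process(raw_line.rstrip("\n")) + "\n")
```

Run as a filter, e.g. `python3 anonymize_logs.py < access.log > access.anon.log`. The point is simply that, once the retention policy is decided, the mechanical part is small; the hard part is the decision to stop keeping full profiles.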
Received on Saturday, 13 October 2012 13:30:28 UTC