
Re: Deciding Exceptions (ISSUE-23, ISSUE-24, ISSUE-25, ISSUE-31, ISSUE-34, ISSUE-49)

From: Alan Chapell <achapell@chapellassociates.com>
Date: Thu, 02 Feb 2012 10:00:24 -0500
To: Jeffrey Chester <jeff@democraticmedia.org>, Bryan Sullivan <blsaws@gmail.com>
CC: Jonathan Mayer <jmayer@stanford.edu>, "public-tracking@w3.org (public-tracking@w3.org)" <public-tracking@w3.org>
Message-ID: <CB500CF4.12ED3%achapell@chapellassociates.com>
Jeff - 

As I understand it, the compelling privacy concern articulated by you and
others is the collection and use of data gathered across sites in ways
that alter the user's experience and/or interfere with the user's privacy
interests (e.g., visiting McDonalds.com and cholesterol.com, and then
having your insurance rates rise as a result). Do I have that correct?



Cheers,

Alan Chapell
Chapell & Associates
917 318 8440






On 2/2/12 9:40 AM, "Jeffrey Chester" <jeff@democraticmedia.org> wrote:

>Bryan:  I will be happy to help elucidate the user privacy case.  As you
>know, both the FTC and EU expect the DNT standard to seriously address
>the expansive data collection practices that have been routinized.  If
>there wasn't such a compelling privacy concern, none of us would be doing
>this.  Indeed, I am happy to meet you halfway on the discussion.  But
>it's the current business model that has created the political need for an
>effective DNT, including in the mobile/location environment.
>
>
>> 
>> To balance the approach, in my view any argument against exceptions must
>> satisfy an equally rigorous test:
>> 
>> 1) Specifically defined. Data that is considered privacy sensitive must
>> be clearly delineated regarding collection, retention, and use. Any such
>> data that is subsequently identified by business stakeholders as
>> important to Business As Usual (BAU) apart from the narrow purpose of
>> cross-site tracking needs a privacy sensitivity explanation that is
>> extraordinarily explicit.
>> 
>> 2) No blanket restrictions. We should grant or deny an exception on the
>> merits of how it balances privacy and commerce, not solely upon a
>> specific privacy concern.
>> 
>> 3) Compelling privacy concern. Data that is considered privacy sensitive
>> must be clearly delineated in terms of why it is so. A bald assertion
>> that collection, retention, and use of specific data should be
>> disallowed is inadequate. I expect privacy advocates to explain, with
>> specificity, what privacy concern is related to the collection,
>> retention, and use of specific data items. Further, I expect privacy
>> advocates to point to documented examples of how such data collection,
>> retention, and use has been at the root of publicly visible privacy
>> issues in the past, i.e., is not purely a theoretical concern.
>> 
>> 4) Significantly furthers the privacy need.  I expect privacy advocates
>> to explain exactly how and to what extent a proposed restriction will
>> further the compelling privacy needs they have identified. Note that
>> public discussion of some specific privacy concerns or BAU purposes may
>> be deemed inappropriate, as not everyone following this list may be
>> well-intentioned. Therefore we may need some process for discussion on
>> a private list or in F2F for specific items of particular sensitivity.
>> 
>> 5) Justified minimization.  If there is a technical alternative to
>> current BAU technical approaches, the relative value of that approach
>> must be clarified by privacy advocates. The burden will be on privacy
>> advocates to show that an alternative technical approach has relative
>> value in a balance of privacy and commerce concerns (including
>> estimation of per-subscriber costs to switch to using the new
>> technology/approach).
>> 
>> 
>> Without a clear justification in these terms, I would err on
>> preservation of BAU, again within the blanket use restriction for
>> cross-site tracking.
>> 
>> Best Regards,
>> Bryan
>> 
>> On 2/1/12 6:45 PM, "Jonathan Mayer" <jmayer@stanford.edu> wrote:
>> 
>>> The working group has made great progress on the broad contours of the
>>> definition document, and the conversation is shifting to specific
>>> exceptions.  With that in mind, now seems an appropriate time to
>>> articulate my views on when and how exceptions should be granted.
>>> 
>>> At a high level, we all agree that exceptions reflect a delicate
>>> balance between consumer privacy interests and commercial value.
>>> There are, no doubt, substantial differences in opinion about where
>>> that balance should be struck.  I hope here to clarify my approach and
>>> help others understand why I find recent proposals for blanket
>>> exceptions to be non-starters.
>>> 
>>> In my view, any exception must satisfy this rigorous six-part test.
>>> 
>>> 1) Specifically defined.  An exception must clearly delineate what
>>> data may be collected, retained, and used.  If a proposed exception is
>>> purely use-based, that needs to be extraordinarily explicit.
>>> 
>>> 2) No special treatment.  We should grant or deny an exception on the
>>> merits of how it balances privacy and commerce, not a specific business
>>> model.
>>> 
>>> 3) Compelling business need.  A bald assertion that without a specific
>>> exception Do Not Track will "break the Internet" is not nearly enough.
>>> I expect industry stakeholders to explain, with specificity, what
>>> business purposes they need data for and why those business purposes
>>> are extraordinarily valuable.
>>> 
>>> 4) Significantly furthers the business need.  I expect industry
>>> participants to explain exactly how and to what extent a proposed
>>> exception will further the compelling business needs they have
>>> identified.  In some cases, such as security and fraud exceptions,
>>> this may call for technical briefing.
>>> 
>>> 5) Strict minimization.  If there is a privacy-preserving technology
>>> that has equivalent or nearly equivalent functionality, it must be
>>> used, and the exception must be no broader than that technology.  The
>>> burden is on industry to show that a privacy-preserving alternative
>>> involves tradeoffs that fundamentally undermine its business needs.
>>> In the context of frequency capping, for example, I need to hear why -
>>> specifically - client-side storage approaches will not work.  In the
>>> context of market research, to take another example, I would need to
>>> hear why statistical inference from non-DNT users would be
>>> insufficient.
>>> 
>>> 6) Balancing.  There is a spectrum of possible exceptions for any
>>> business need.  At one end is a pure use-based exception that allows
>>> for all collection and retention.  At the other end is no exception at
>>> all.  In between there are infinite combinations of collection,
>>> retention, and use limits, including exceptions scoped to
>>> privacy-preserving but inferior technologies.  In choosing among these
>>> alternatives, I am guided by the magnitude of commercial need and
>>> consumer privacy risk.  I am only willing to accept an exception where
>>> the commercial need substantially outweighs consumer privacy
>>> interests.
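[A sketch of the client-side alternative raised in point 5 above: frequency capping whose state lives entirely in the user agent (a cookie or localStorage), so no cross-site profile is needed server-side. This is a hypothetical illustration, not code from any draft; the `FrequencyCap` class and the key-value store standing in for browser storage are assumptions for the sketch.]

```typescript
// Hypothetical client-side frequency cap: per-ad view counts are kept
// in a user-agent key-value store (stand-in for a cookie/localStorage),
// and only a show/don't-show decision is made at serve time.
type KVStore = Map<string, string>;

class FrequencyCap {
  constructor(private store: KVStore, private maxViews: number) {}

  // Returns true if the ad may still be shown, and records the view.
  recordAndCheck(adId: string): boolean {
    const key = `freqcap:${adId}`;
    const views = parseInt(this.store.get(key) ?? "0", 10);
    if (views >= this.maxViews) return false; // cap reached
    this.store.set(key, String(views + 1)); // count stays on the client
    return true;
  }
}
```

[Under this sketch the server never learns how often a given user saw the ad, which is the tradeoff point 5 asks industry to argue against.]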
>>> 
>>> Example exceptions may help illustrate my thinking, so here are a few
>>> from the IETF Internet-Draft.
>>> 
>>>>   3. Data that is, with high confidence, not linkable to a specific
>>>>       user or user agent.  This exception includes statistical
>>>>       aggregates of protocol logs, such as pageview statistics, so
>>>>       long as the aggregator takes reasonable steps to ensure the
>>>>       data does not reveal information about individual users, user
>>>>       agents, devices, or log records.  It also includes highly
>>>>       non-unique data stored in the user agent, such as cookies used
>>>>       for advertising frequency capping or sequencing.  This
>>>>       exception does not include anonymized data, which recent work
>>>>       has shown to be often re-identifiable (see [Narayanan09] and
>>>>       [Narayanan08]).
>>>>   4. Protocol logs, not aggregated across first parties, and subject
>>>>       to a two week retention period.
>>>>   5. Protocol logs used solely for advertising fraud detection, and
>>>>       subject to a one month retention period.
>>>>   6. Protocol logs used solely for security purposes such as intrusion
>>>>       detection and forensics, and subject to a six month retention
>>>>       period.
>>>>   7. Protocol logs used solely for financial fraud detection, and
>>>>       subject to a six month retention period.
>>> 
>>> 
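[One hypothetical example of the "reasonable steps" item 3 of the quoted draft requires of aggregators: suppressing any aggregate bucket with too few contributors, so published statistics cannot single out an individual. The function name and the threshold are illustrative assumptions, not taken from the draft.]

```typescript
// Hypothetical aggregation safeguard: drop any pageview bucket whose
// count falls below minCount, so the published aggregate does not
// reveal information about individual users or log records.
function suppressSmallBuckets(
  counts: Map<string, number>,
  minCount: number
): Map<string, number> {
  const out = new Map<string, number>();
  for (const [page, n] of counts) {
    if (n >= minCount) out.set(page, n); // keep only well-populated buckets
  }
  return out;
}
```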
>>> I would add, in closing, that in difficult cases I would err on the
>>> side of not granting an exception.  The exemption API is a policy
>>> safety valve: If we are too stringent, a third party can ask for a
>>> user's consent.  If we are too lax, users are left with no recourse.
>>> 
>>> Best,
>>> Jonathan
>>> 
>>> 
>> 
>> 
>> 
>> 
>
>
>
Received on Thursday, 2 February 2012 15:07:43 UTC

This archive was generated by hypermail 2.3.1 : Friday, 3 November 2017 21:44:44 UTC