Re: Decisions by the group

Thanks, Christine & Co., for doing such a great job getting this working group producing useful stuff. 

A special thanks also to Hannes for getting the conversation started, and to those who have been working on the new privacy considerations document - it is a monumental document and a great improvement. I have already found it very helpful. 

Regarding item 3a of Hannes's post, Notice & Consent, and in the context of JC's comments on that post, the technical application of opt-ins and opt-outs is far too narrow to deal with the policy and transparency challenges that N&C is meant to address. 

Currently, the N&C apparatus on the Internet is fundamentally broken, and it is questionable whether the administration of real digital identities is even legal with current N&C implementations. Because these policies are not fair and the notice provided is not adequate, the web-service user is left with only a contract 'of adhesion': the choice to use or not to use a service.

Par from the Biggest Lie pointed this out on the list a couple of weeks ago, and just three days ago JC wrote: 

" "Notice and Consent" to include Transparency.....<snip>"

"You give a fine example of a consent mechanism for apps, http://blog.benward.me/post/968515729, but it wouldn't work well for websites as the fatigue you mention would quickly set in. Addressing N&C for websites is a tough issue and the current cookie model makes it harder. If we could find a persistent, ubiquitous means for users to be able indicate sites that they trust or distrust without all the popups, banners, and opaqueness (e.g. Trust as4rg.net) we would get a huge high-five from Internet users around the globe."


I entirely agree with his sentiment. Not only would this earn a big high-five from "Users", but also from all of the privacy-related working groups.  

I suggest we need to seriously look at how to open notice, and by doing so open the management of consent! 

Currently, N&C is a relic of the Industrial Age: each policy/contract is ad hoc and not systemically usable, thus contributing to a big lie about control over, and access to, the use of personal information and identity. A new collective effort to create an open standard for notices, and by doing so open up the policy infrastructure, is not only required but, I would argue, highly adoptable.

Over the last few years, I have been cataloguing the technical, legal, and social issues with the existing N&C infrastructure. 

I believe this approach can succeed because openness is a privacy principle in many countries, and notice is the first and most fundamental of the FIPs (Fair Information Practices). Notice is also the only principle common to every jurisdiction with privacy legislation. In effect, notice is a vehicle for information control and sits at the intersection of all these issues. This can be observed by looking at the global N&C infrastructure that already exists, online and offline.   

As for adoption, it appears that an open notice standard would almost immediately be required by law (in multiple jurisdictions), not only for personal information management but also for security and health-and-safety notices and physical signs. Creating even just a standard format, or meta-format, for notice communication, external to websites (or even the web), would enable collaboration, innovation, and immense economic benefit, and is a clear path to privacy-by-design solutions for existing and emerging information personalisation, control, and management challenges. 
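
To make this concrete, here is a purely illustrative sketch (in Python) of what a minimal machine-readable notice record might contain. The field names and structure below are my own hypothetical example, not a proposal or an existing standard:

    # Hypothetical sketch of a machine-readable notice record.
    # Field names and values are illustrative only, not a proposed standard.
    example_notice = {
        "issuer": "example.com",                             # who gives the notice
        "purpose": ["account management", "analytics"],      # why data is collected
        "data_categories": ["email address", "usage data"],  # what is collected
        "retention": "12 months",                            # how long it is kept
        "third_parties": ["analytics-provider.example"],     # who else receives it
        "policy_url": "https://example.com/privacy",         # full human-readable policy
        "issued_at": "2012-07-27T12:00:00Z",                 # when the notice was issued
    }

A record like this, published in a predictable place, is the kind of thing a user agent or third-party tool could collect and present consistently across sites - which points towards the persistent, ubiquitous mechanism JC describes above.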

In this regard, I agree - a lively discussion is needed. As a result, we are calling for participation in a new effort called opennotice.org. If you want to help found such an effort, please join the Google group: opennotice@googlegroups.com. 

Warm Regards, 

Mark Lizar

---
On 25 Jul 2012, at 15:11, Christine Runnegar wrote:

> Hi Hannes,
> 
> Thanks for raising these important questions. This is our perspective, having regard to the charter:
> 
> 1) Our goal is to develop a guidance document for W3C standards (at large).
> 
> In particular, this would include providing guidance on:
> - identifying privacy risks/vulnerabilities associated with Web standards
> - how to build privacy into the design of a specification 
> 
> Note: there may also be value in developing additional specific guidance for particular standards.
> 
> And, we can look more broadly if we want to.
> 
> 2) The target for the guidance document for W3C standards is the standardization community (specifically, people who develop W3C standards) and those who then deploy them.
> 
> Note: there may also be value in developing additional guidance targeting implementers and deployers (e.g. best practice documents).
> 
> 3) We do not have the answer to this one, and would encourage discussion on this list.
> 
> One thing to keep in mind is that perhaps more than one model (or strategy) may be relevant depending on the standard.
> 
> However, before we get to guidance, perhaps it would be useful for the group to identify privacy risks/vulnerabilities associated with Web standards. Again, we encourage discussion on this list.
> 
> Christine and Tara
> 
> On Jul 20, 2012, at 1:58 PM, Hannes Tschofenig wrote:
> 
>> At yesterday's conference call I was asked for my opinion on what to do in the PING working group. While I tried to answer that question on the phone, I am not sure I got my points across. Here is an attempt to summarize my thoughts. 
>> 
>> There are three aspects for group members to decide:
>> 
>> 1) What is the scope of a guidance document? 
>> 
>> The W3C covers a pretty broad scope of work. In response to Robin's write-up, which focused on privacy guidance for those who develop JavaScript APIs, I argued that this scope is too narrow. There is other work in the W3C that is looking for guidance as well. My favorite example is CORS.
>> 
>> What does the group think is useful to cover?
>> 
>> 2) Who is the target audience of the recommendation?
>> 
>> I have already shared my views on this topic in previous mails and tried to explain the difference between the standardization community, implementers, and deployers.
>> 
>> The group has to decide what the audience is because the recommendations will be different for these different groups.  
>> 
>> 3) What is your model for privacy protection?
>> 
>> Before going into the details of providing guidance it is useful to think about the main direction.
>> I have seen different themes (and all have their pros & cons). Here are some examples (and one could combine different approaches):
>> 
>> a) Notice and Consent model
>> 
>> Before the collection of data, the data subject should be provided with a notice of what information is being collected and for what purpose and an opportunity to choose whether to accept the data collection and use.
>> 
>> There are also further design aspects about when this consent should happen. One model is to push it to contracts (e.g., terms of service and privacy notices when you sign up for the service) and another model is to ask the user at the time of sharing (in real-time). As a simplified summary, the latter may require additional specific work or integration of some protocol mechanisms (e.g., OAuth) and the former doesn't. 
>> 
>> The challenge here is that sometimes (often?) users aren't asked before sharing happens. Here is an example from an article published yesterday about the reality: http://www.cultofmac.com/179733/19-of-ios-apps-access-your-address-book-without-your-permission-until-ios-6-report/
>> 
>> We also want to avoid notification fatigue and to provide good choices instead of take-it-or-leave-it schemes (which we see all too often today). Here is an example of a better permission dialog (a mock-up, of course): http://blog.benward.me/post/968515729
>> 
>> This sounds like one has to standardize the user interface, but this is not necessary. Instead, one can talk about the user interaction in an abstract way. Barry Leiba provides an example in this document for OAuth: 
>> http://tools.ietf.org/html/draft-leiba-oauth-additionalsecurityconsiderations-00
>> 
>> b) Data Minimization
>> 
>> With this approach, the idea is that you figure out the bare minimum of data you need for your service to work and design the system accordingly. End devices then provide to the various service providers only the data they really need. This is an approach often chosen by researchers since it has a lot of impact on the overall system design. The beauty of this approach is that when information is not available to a party, that party cannot leak it or share it in a way that violates the user's expectations. 
>> 
>> One challenge is that those who design the system don't like to restrict themselves.
>> 
>> c) User preference indication
>> 
>> This model is the result of realizing that some parties get the data anyway (or have it already), and so we want to tell them how to use it. This is the GEOPRIV sticky-policy approach or, a more recent approach, the DNT header. 
>> 
>> The drawback of that approach is that it relies heavily on data protection authorities (DPAs) to act on non-compliance. Whether there will be any enforcement remains to be seen. 
>> 
>> 
>> 
> 
> 

Received on Friday, 27 July 2012 12:54:26 UTC