Re: [proposals] Notice and consent debate (#5)

To add to @alextcone's excellent notes above, I wanted to clarify a few points.

The important point here is that notice and choice does not provide adequate protection and does not enable people to make decisions that correspond to autonomous choice. What this means is that, as a standards group, we cannot create a potentially dangerous piece of technology and then shirk our responsibilities by saying "it's okay, we'll just prompt the user for their consent." Not only is that an established bad practice in privacy, it is _also_ an established bad practice in browser UI. The Web community has been struggling for years, nay, decades with permission systems for powerful capabilities and it remains an unsolved problem. Notice and choice is to privacy what ActiveX prompts were to security. We might just be better off if we don't reproduce the same mistakes we made in the 90s.

![image](https://user-images.githubusercontent.com/38491/153466978-2fae5f1d-4829-464a-a242-2ca71c800487.png)

That we will design systems that are safe without notice and choice doesn't mean that we can magic notice and choice away when it is legally required. Without getting into details, there are definitely potential conflicts between the ePrivacy Directive and some privacy-enhancing techniques. The fact that ineffective laws exist does not lessen the value of producing technology that delivers utility while being private by design. If and when the ePD is an issue, we can explore options. One would be to ask legislators to carve out exemptions for specific PET designs. Another, if we can demonstrate credible enforcement in the system, could be to bring an Article 40 Code of Conduct to a DPA or the EDPB that would waive (as is already done in some cases) local storage requirements. It's too early for that now, though; we can cross that bridge when we get there. So @jdelhommeau, I think we are all agreed on this?

@anderagakura I'm glad that you enjoyed Barrett's article, I find her to be a very effective (and funny) writer. I agree with you that we will still need transparency. The point here is not that we should eliminate transparency (or even choice, when it is useful and effective), but rather that we shouldn't _rely_ on transparency or choice to make a system safe.

It's like the **Linux Problem**: being _able_ to tinker with something is liberating, being _required_ to tinker with a thing before it can be useful is alienating.

I believe that we have consensus on the following statements:

* Notice and choice is not an appropriate mechanism with which to protect people and meet their expectations of privacy. This statement is well documented in the scientific record.
* Making a system privacy-friendly does not necessarily eliminate legal requirements in all jurisdictions. There may be solutions there, but we will have to cross that bridge when we get there.
* Not _relying_ on notice and choice does not mean _eliminating_ notice and choice. In cases in which it can contribute, for instance to transparency, it can still add value.

Unless there is an unresolved objection in substance, I propose that we close this issue. This point is discussed in the TAG's privacy document (forthcoming), and therefore I don't think that we need to capture it in the PAT-specific principles document.

-- 
GitHub Notification of comment by darobin
Please view or discuss this issue at https://github.com/patcg/proposals/issues/5#issuecomment-1035273304 using your GitHub account


-- 
Sent via github-notify-ml as configured in https://github.com/w3c/github-notify-ml-config

Received on Thursday, 10 February 2022 18:18:08 UTC