Re: [meetings] Agenda Request - Should PATCG be opinionated on which technologies are used to enable privacy? (#39)

Dear all,

I don't think that we can usefully "define privacy" for the context in 
which we work, but I think that we can reach consensus about how to manage 
privacy topics here. That's definitely worth discussing.

Privacy is the set of rules that govern information flows that are about 
people or that impact people. These rules are context-specific, and contexts 
can nest and overlap. (For the nerds out there, this is grounded in the GKC 
(Governing Knowledge Commons) privacy framework; it's well adapted to 
commons situations like ours.)

For the general question of privacy on the web, the TAG has a set of 
principles. They're not finalised, but they're usable. For the more 
specific context of advertising on the web, we should inherit and 
specialise them. To give one example, the TAG's principles have stern 
warnings about consent but don't rule it out entirely. The reason is that 
some Web processing is meaningfully consentable (e.g. `[ ] Receive 
newsletter`). For advertising contexts, where the threat is the sharing of 
cross-site browsing data, we know that such sharing is not consentable, so 
we can have a specialised rule that excludes that approach.
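
To make the inherit-and-specialise idea concrete, here's a minimal sketch; 
the rule names and wording are mine for illustration, not taken from the 
TAG document:

```python
# Hypothetical illustration of inheriting general rules and specialising
# them for a narrower context. The rule names and text are made up.
TAG_PRINCIPLES = {
    "consent": "permitted where the processing is meaningfully consentable",
}

ADVERTISING_CONTEXT = {
    **TAG_PRINCIPLES,  # inherit the general web rules...
    # ...then override: cross-site browsing data is not consentable here.
    "consent": "not a valid basis for sharing cross-site browsing data",
}
```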

I don't think that we can come up with all the rules a priori, all at once. 
We're going to find corner cases because reality is complicated, and that's 
fine. But we can agree to ground ourselves in the TAG principles and to 
elaborate new rules as we progress. I think that's actually better and more 
realistic than coming up with all the rules first.

Does that mean that we should be opinionated as to technologies? Yes, but 
we don't necessarily know how yet. One crucial aspect of commons thinking 
is that only the rules that are actually enforced count ("rules in use"). 
We're going to want rules that we can prove work, not pinky promises. (I 
think this is a better framing than the "mostly technical" requirement, 
even though in practice it often amounts to the same thing.) So being on 
the same page as to which technologies work for what strikes me as 
particularly useful. It's like a shared toolbox. I think of these 
technologies as "ways to be opinionated." It doesn't mean that we put the 
cart before the horse.
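
As one toy example of a rule that's enforced by the mechanism itself 
rather than by promise, consider a differentially private release of an 
aggregate count. This is only a sketch; the epsilon value and function 
names are mine, not from any PATCG proposal:

```python
import random

def laplace_noise(scale: float) -> float:
    # The difference of two i.i.d. exponentials is Laplace-distributed.
    rate = 1.0 / scale
    return random.expovariate(rate) - random.expovariate(rate)

def noisy_count(true_count: int, epsilon: float) -> float:
    # A counting query has sensitivity 1, so Laplace noise with
    # scale 1/epsilon gives epsilon-differential privacy for this release.
    return true_count + laplace_noise(1.0 / epsilon)

# e.g. an aggregate conversion count released to an advertiser
print(noisy_count(12_345, epsilon=0.5))
```

Whatever the noised output, no individual's contribution can be recovered 
from it: the guarantee holds by construction, not by policy.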

-- 
Robin Berjon
VP Data Governance
Acting VP Marketing Analytics
The New York Times
On April 1, 2022 18:15:44 Erik Taubeneck ***@***.***> wrote:
>
> @bmayd I'm not sure I agree with your framing. Specifically:
>
> > If folks have already determined that sensitive data is required and 
> > that it will be available
>
> I think (and am assuming) that we have broad consensus that individual 
> cross-site behavior data is considered sensitive data, and we have a number 
> of measurement proposals (Attribution Reporting API, IPA, PCM) which 
> construct some way of making that available.
>
> To be clear, the diagram above is not meant to be the framework for 
> everything this group does, but it does seem to be a helpful abstraction 
> for a common pattern that emerges in some of the existing proposals within 
> this space.
>
> > I would find it very helpful to preface discussion of the technologies 
> > with review of those things.
>
> This review is better suited for other topics on the agenda, specifically 
> the Update on the Privacy Principles and the consensus on the charter. 
> Unless the consensus is that we will do nothing in the form of private 
> computation, I don't see any harm in making progress in a discussion of 
> the underlying technologies this group is already proposing to leverage.



-- 
GitHub Notification of comment by darobin
Please view or discuss this issue at https://github.com/patcg/meetings/issues/39#issuecomment-1086415358 using your GitHub account



Received on Friday, 1 April 2022 23:42:33 UTC