Re: [docs-and-reports] Privacy and Purpose Constraints (#15)

> Typically, I think a purpose in this context would be "measurement of purchases resulting from advertising" not "to calculate a result of a certain class of function in aggregate form with a differentially private guarantee". 

This makes sense; however, I think we are mixing up the threat model for the PATCG/WG and the specific _private measurement_ spec. For the threat model, our goal is to consider proposals that offer technical means of purpose limitation (for lack of a better name at the moment). For the _private measurement_ spec, the purpose is something like "differentially private measurement of conversions resulting from advertising impressions."

In that sense, the goal of the threat model is to help evaluate whether a specific proposal actually limits the use of data to its specified purpose through technical guarantees.

> The purpose of the data collected or how it's being used is not specified in explicit terms to the user as part of the technical guarantee of the cryptographic design of the system.

I think this depends on what is meant by "the data collected". For the _private measurement_ spec, if it's the (typically encrypted) data leaving the client, then I'd argue that the purpose is in fact specified as above (and the system limits the data's use to that purpose). If "the data collected" is the aggregates which come out of the private computation system, then I agree that it's not specified. That said, I think you can always make that argument about the outputs of any such system, and it's turtles all the way down.
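
To make that distinction concrete, here is a minimal sketch of what I mean by an aggregate coming out of the private computation system. Everything in it (the Laplace mechanism, the epsilon value, the function name) is an illustrative assumption on my part, not something taken from any proposal or spec:

```python
import math
import random

def dp_conversion_count(true_count: int, epsilon: float = 1.0, sensitivity: float = 1.0) -> float:
    """Return the true conversion count plus Laplace noise with scale sensitivity/epsilon."""
    scale = sensitivity / epsilon
    # Draw Laplace(0, scale) noise via the inverse CDF of a uniform sample.
    u = random.uniform(-0.5, 0.5)
    noise = -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))
    return true_count + noise

# The value released to the measurement caller is only this noisy aggregate;
# the per-user reports that left the client remain encrypted throughout.
print(dp_conversion_count(1234, epsilon=1.0))
```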

-- 
GitHub Notification of comment by eriktaubeneck
Please view or discuss this issue at https://github.com/patcg/docs-and-reports/issues/15#issuecomment-1286004834 using your GitHub account


-- 
Sent via github-notify-ml as configured in https://github.com/w3c/github-notify-ml-config

Received on Thursday, 20 October 2022 19:00:08 UTC