[proposals] A proposal for privacy preserving ad attribution measurement using Prio-like architecture (#17)

winstrom has just created a new issue for https://github.com/patcg/proposals:

== A proposal for privacy preserving ad attribution measurement using Prio-like architecture ==
We propose a mechanism for measuring advertisement attribution with privacy guarantees.

We try to build on previous privacy proposals such as [<u>Private Click Measurement</u>](https://webkit.org/blog/11529/introducing-private-click-measurement-pcm/) (PCM), [<u>Interoperable Private Attribution</u>](https://github.com/patcg-individual-drafts/ipa/blob/main/IPA-End-to-End.md) (IPA), and [<u>Attribution Reporting API with Aggregatable Reports</u>](https://github.com/WICG/attribution-reporting-api/blob/main/AGGREGATE.md) (ARA). Our goal at each stage is to transmit only the minimum information necessary to perform the attribution measurement and nothing else.

Like PCM, we rely on the user’s device to join an advertisement impression and conversion together. This means that the browser is trusted with event-level information on user interactions and joins it into summaries of attribution represented as *histograms*. These histograms contain only the attribution value of a conversion, never a browsing history.
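To illustrate the device-side join described above, here is a minimal Python sketch. It is not code from the proposal; the `AttributionStore` class, its method names, and the one-impression-per-advertiser policy are all hypothetical simplifications. The point is only that the raw events stay on the device and the sole output is a histogram contribution.

```python
from dataclasses import dataclass


@dataclass
class Impression:
    advertiser: str
    campaign_index: int  # which histogram bucket this ad maps to


class AttributionStore:
    """Hypothetical on-device store that joins impressions with later conversions."""

    def __init__(self):
        # Keep only the most recent impression per advertiser (a simplification).
        self._impressions = {}

    def record_impression(self, imp: Impression) -> None:
        self._impressions[imp.advertiser] = imp

    def record_conversion(self, advertiser: str, value: int,
                          histogram_size: int) -> list:
        """On conversion, emit only a histogram contribution, never the raw events.

        If no matching impression exists, the contribution is all zeros, so the
        output shape reveals nothing about whether an ad was actually seen.
        """
        imp = self._impressions.pop(advertiser, None)
        hist = [0] * histogram_size
        if imp is not None:
            hist[imp.campaign_index] = value
        return hist
```

Note that even the "no matching impression" case produces a full-size histogram, so a network observer cannot distinguish attributed from unattributed conversions by message shape.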

Like IPA, we rely on Multi-Party Computation (MPC) frameworks to cryptographically segment data across multiple computation partners so that no individual organization can track an individual. This system is used both for aggregation and to introduce differentially private noise, ensuring that there is a well-defined privacy loss bound for each user of the system. We rely on the Prio ([<u>Prio | Stanford Applied Crypto Group</u>](https://crypto.stanford.edu/prio/)) framework to perform this multi-party aggregation, and on Differential Privacy ([<u>Differential Privacy</u>](https://www.microsoft.com/en-us/research/wp-content/uploads/2016/02/dwork.pdf), [<u>The Algorithmic Foundations of Differential Privacy</u>](https://www.cis.upenn.edu/~aaroth/Papers/privacybook.pdf)) to add appropriate noise to attribution calculations to make them private.
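The core of Prio-style aggregation can be sketched in a few lines of Python. This is an illustrative toy, not the proposal's implementation: the modulus is an arbitrary choice, and it omits the zero-knowledge validity proofs and server-added noise that a real Prio deployment would include. It shows only the additive secret sharing: each client splits its histogram so that any single server sees uniformly random values, yet the servers' local sums combine into the true aggregate.

```python
import secrets

# Illustrative prime modulus for the shared field (not prescribed by the proposal).
MODULUS = 2**61 - 1


def split(histogram, n_servers=2):
    """Additively secret-share each bucket; any n_servers-1 shares look random."""
    shares = [[secrets.randbelow(MODULUS) for _ in histogram]
              for _ in range(n_servers - 1)]
    final = [(v - sum(s[i] for s in shares)) % MODULUS
             for i, v in enumerate(histogram)]
    return shares + [final]


def server_aggregate(client_shares):
    """One server sums, bucket-wise, the shares it received from all clients."""
    return [sum(bucket) % MODULUS for bucket in zip(*client_shares)]


def combine(server_totals):
    """Combining each server's local total reveals only the aggregate histogram."""
    return [sum(bucket) % MODULUS for bucket in zip(*server_totals)]
```

Because reconstruction only ever happens on the summed totals, no party learns an individual client's histogram; only the aggregate is revealed.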

Like ARA, we wish to allow measurements across a large, sparse space defining the potential linkages between advertisers and publishers. We present a concrete way to encode this sparse space using dense histograms so that individual contributions can be aggregated using known MPC approaches.  
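One simple way to picture the dense encoding is index flattening: a sparse (advertiser, campaign) key is mapped to a single bucket in a fixed-size vector, so every client's contribution has the same shape regardless of which linkage it reports. The sketch below is a hypothetical illustration of this idea, not the proposal's actual encoding, which is specified in the linked documents.

```python
def dense_contribution(advertiser_index, campaign_index, value,
                       n_advertisers, n_campaigns):
    """Flatten a sparse (advertiser, campaign) key into one bucket of a dense
    histogram of fixed size n_advertisers * n_campaigns.

    Every client submits the same vector shape, which is what lets standard
    MPC aggregation sum the contributions bucket-wise.
    """
    size = n_advertisers * n_campaigns
    bucket = advertiser_index * n_campaigns + campaign_index
    hist = [0] * size
    hist[bucket] = value
    return hist
```

The trade-off is that the dense vector grows with the size of the key space, which is why a concrete encoding scheme matters when the space of potential linkages is large.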

For a readable overview, we provide an [explainer](https://github.com/patcg/proposals/files/12449973/private_ad_measurement_explainer.md).

We also provide a [document with more details](https://github.com/patcg/proposals/files/12449974/private_ad_measurement_details.md).


Please view or discuss this issue at https://github.com/patcg/proposals/issues/17 using your GitHub account


-- 
Sent via github-notify-ml as configured in https://github.com/w3c/github-notify-ml-config

Received on Monday, 28 August 2023 03:35:44 UTC