[meetings] Agenda Request - Methods for measuring privacy risk (#129)

grahammudd has just created a new issue for https://github.com/patcg/meetings:

== Agenda Request - Methods for measuring privacy risk ==
## Agenda+: What do you want to discuss?

Across the existing and proposed privacy APIs, we see a wide range of anonymization techniques employed (k-anonymity, differential privacy, limiting entropy). Even within a single technique such as differential privacy, there isn’t a standardized implementation. As an example, the privacy unit differs between ARA and IPA. As this group has discussed, solutions that are able to maximize both privacy and utility will likely require a mix of several techniques.
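As context for why a shared measurement framework matters, the gap between two "differentially private" systems can be sketched with a simple Laplace mechanism. This is an illustrative sketch only; the function names and parameters are hypothetical and not taken from ARA, IPA, or any other API discussed here:

```python
import math
import random

def laplace_noise(scale: float) -> float:
    """Draw one sample from a Laplace(0, scale) distribution via inverse CDF."""
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def dp_count(true_count: int, epsilon: float, sensitivity: float = 1.0) -> float:
    """Release a count with epsilon-DP by adding Laplace(sensitivity / epsilon) noise.

    The value of `sensitivity` depends on the chosen privacy unit: if the unit
    is a single *event*, each unit changes the count by at most 1; if the unit
    is a *user* who may contribute up to k events, sensitivity grows to k, and
    so does the noise required for the same epsilon.
    """
    return true_count + laplace_noise(sensitivity / epsilon)

# Same epsilon, different privacy units => different noise scales.
per_event = dp_count(1000, epsilon=1.0, sensitivity=1.0)    # unit: one event
per_user = dp_count(1000, epsilon=1.0, sensitivity=10.0)    # unit: one user, up to 10 events
```

The point of the sketch: two APIs can both claim the same ε while offering very different protection, because the noise scale (and hence the guarantee per person) depends on what one privacy unit is allowed to change. A standardized measurement framework would have to account for exactly this kind of difference.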

Given that, we propose organizing a sub-group focused on methods for measuring privacy risk (or privacy creation). For example, [this recent paper](https://arxiv.org/abs/2304.07210) from the Google Privacy Sandbox team seeks to measure the risk of re-identification in the Topics API. The thinking is that if we can standardize on a framework for measuring privacy across techniques, we will accelerate the industry’s understanding, and ultimately its acceptance, of these tools.

Goals:

1. Align on whether a sub-group effort would be productive
2. Align on goals for the project
3. Call for participants

Facilitators:  @tgreasby & @grahammudd

## Time

30 mins

### Links

- https://arxiv.org/abs/2304.07210

Please view or discuss this issue at https://github.com/patcg/meetings/issues/129 using your GitHub account


-- 
Sent via github-notify-ml as configured in https://github.com/w3c/github-notify-ml-config

Received on Thursday, 8 June 2023 16:17:44 UTC