- From: Manish Goregaokar <notifications@github.com>
- Date: Tue, 24 Nov 2020 10:00:26 -0800
- To: w3ctag/design-reviews <design-reviews@noreply.github.com>
- Cc: Subscribed <subscribed@noreply.github.com>
Received on Tuesday, 24 November 2020 18:00:38 UTC
@asankah:
> Could you elaborate a bit more on how an implementation should evaluate a noising or rounding strategy? I.e. how should an implementation evaluate anonymity?

At the moment, we don't have a clear idea of this: @fordacious / @thetuvix / @cabanier might, though. This is one of the bits of privacy work I'd like to see as we move forward (since I consider the API surface mostly "done").

It also might be worth downgrading this to a SHOULD, since a valid choice for an implementation to make is to expose precise data but be clear about fingerprinting risks in the initial permissions prompt.

--
You are receiving this because you are subscribed to this thread.
Reply to this email directly or view it on GitHub:
https://github.com/w3ctag/design-reviews/issues/568#issuecomment-733143138
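(For illustration only, not part of the thread: one way to picture the "noising or rounding" strategies under discussion is snapping reported values to a grid, or adding random jitter. The 5 cm grid size below is an arbitrary assumption, not anything proposed in the issue.)

```typescript
// Hypothetical sketch of two strategies an implementation might evaluate
// before exposing precise spatial data:
//
// 1. Rounding: quantize a coordinate to a fixed grid so exact values
//    cannot serve as a fingerprinting signal.
// 2. Noising: add uniform random jitter within half a grid cell.
//
// The 0.05 m grid is an illustrative choice only.
function quantize(valueMeters: number, gridMeters: number = 0.05): number {
  return Math.round(valueMeters / gridMeters) * gridMeters;
}

function jitter(valueMeters: number, gridMeters: number = 0.05): number {
  return valueMeters + (Math.random() - 0.5) * gridMeters;
}
```

Evaluating anonymity would then mean asking how much identifying information survives after such a transform, which is the open question raised above.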