Re: [csswg-drafts] [css-fonts] incorporate mitigations for font based fingerprinting (#4055)

> I haven't see a standard for it, any specifics of thresholds or empirical observations of it being a useful privacy protection strategy. 

I assume you read the explainer I linked? Note that this is also something we're actively developing; it's *far* from complete at this point.

> Further, since a unique font generally puts someone in an extremely small equiv class by itself (w/o needing to be combined with other inputs), its unclear how a privacy budget approach would be useful here.

A single unique font probably does, yeah. But how do you expect a website to *find* that single unique font the user has? If it's highly identifying, that means only a small number of people have it. So either the website is *only* targeting those handful of people and is thus testing only for that font (an interesting case...), or it's testing lots of "unique" fonts to see *which* small bucket the user falls into. The latter is exactly what the Privacy Budget approach is intended to detect: spamming hundreds or thousands of local font requests, looking for the one that highly identifies the user.
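
To make that distinction concrete, here's a minimal sketch (purely illustrative, not the actual Privacy Budget design) of how a per-site cap on distinct local-font lookups separates ordinary rendering from enumeration; the threshold, names, and intervention are assumptions:

```ts
// Illustrative only: a per-site budget over *distinct* local font families
// probed. Legitimate rendering touches a handful of families; an
// enumeration script probes hundreds or thousands.
const FONT_QUERY_BUDGET = 20; // assumed threshold, not a standardized value

class FontQueryBudget {
  private probed = new Set<string>();

  // Returns false once the site has probed more distinct local families
  // than the budget allows, at which point the browser could intervene
  // (e.g. answer from a generic allow-list instead of the real system).
  allowLookup(family: string): boolean {
    this.probed.add(family);
    return this.probed.size <= FONT_QUERY_BUDGET;
  }
}

const budget = new FontQueryBudget();
budget.allowLookup("Helvetica Neue"); // true: normal text rendering
// ...a thousand more lookups from a fingerprinting script => false
```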

> Put differently, users are being harmed today by this flaw in the font standard. It seems inappropriate to hinge the solution to that problem to something that isn't anywhere close to standardization (i.e. privacy budget).

And as others have argued in this thread, users will be harmed by the suggestion to restrict local font access to system fonts only. (And they aren't currently harmed by Safari doing so, due to the differences in user demographics between browsers.) We need to weigh the benefits, the harms, and the costs of mitigating those harms.

As I argued at TPAC, and as Chris Wilson and others at Google argued in their response to PING's charter discussion, the web is *chock full* of data that can be used for fingerprinting. Any attempt to reduce that, particularly one with significant user-harmful side effects, needs to show that it will actually reduce the fingerprinting surface to a usefully low level; going from 400 bits to 40 bits of identifying information achieves precisely nothing, since you only need 33 bits to uniquely identify every person on Earth. (And you really want to allow fewer than 20 bits, to ensure that people are "bucketed" together with at least several thousand others.)
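
For reference, the rough arithmetic behind those numbers (the population figure is approximate):

```ts
// Back-of-the-envelope check of the bit counts above.
const worldPopulation = 7.7e9;                              // approximate, 2019
const bitsToIdentifyEveryone = Math.log2(worldPopulation);  // ≈ 32.8, i.e. ~33 bits
const peoplePerBucketAt20Bits = worldPopulation / 2 ** 20;  // ≈ 7,300 people per bucket
console.log(bitsToIdentifyEveryone.toFixed(1), Math.round(peoplePerBucketAt20Bits));
```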

If the PING can show that the sum of their suggested mitigations will reduce the fingerprinting surface to 20 bits or less, or at least that there's a believable path to getting under that limit, and that performing all of those mitigations won't harm the web to such an extent that the attack surface just moves elsewhere (such as sites moving to native apps...), then great! That would be an ideal solution, because reducing information wholesale is typically far easier than trying to be clever!

So far, the PING hasn't attempted to show that this is possible. And so far, Chrome's security engineers don't believe an absolute reduction of the fingerprinting surface is reasonably achievable, either. Hence Privacy Budget: our attempt to dynamically enforce a pay-as-you-go budget that, hopefully, will let us prevent attacks (like scanning the user's local fonts) without harming legitimate uses (like using a handful of local fonts to actually render text).
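
A hedged sketch of what "pay-as-you-go" could look like across surfaces; the per-surface costs and the cap here are made up for illustration, not part of any proposal:

```ts
// Illustrative only: accumulate an assumed entropy cost per surface and
// intervene once a site exceeds its budget. A real design would have to
// measure these costs rather than assume them.
const SITE_BUDGET_BITS = 20;

const assumedCostBits: Record<string, number> = {
  "local-font-lookup": 0.5,
  "canvas-readback": 5,
  "webgl-renderer-string": 4,
};

class PrivacyBudgetSketch {
  private spentBits = 0;

  // Returns false when the site has "spent" more identifying bits than
  // its budget; the browser would then degrade or deny further access.
  charge(surface: string): boolean {
    this.spentBits += assumedCostBits[surface] ?? 1;
    return this.spentBits <= SITE_BUDGET_BITS;
  }
}

// A page rendering with a few local fonts spends a couple of bits and
// never notices; a script sweeping every surface hits the cap quickly.
```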

I think you should do more than dismiss Privacy Budget out of hand; it's a serious effort to actually solve fingerprinting across the entire web platform, not an attempt to deflect attention. The math is clear here: this isn't a problem that can be solved with band-aids, and even knowing whether your efforts will achieve anything at all requires a serious analysis of the whole attack surface; standard defense-in-depth security intuitions don't apply, at least not in the current state of things.

So, as Chris Wilson said, without a formal model showing that this change is part of a combined effort that will achieve a useful result, Chrome will continue to oppose it, and will instead pursue methods like the ones I described to achieve *useful* fingerprinting reduction. Harming users and web developers for what is currently just a fig leaf is not something we're interested in.

-- 
GitHub Notification of comment by tabatkins
Please view or discuss this issue at https://github.com/w3c/csswg-drafts/issues/4055#issuecomment-536063793 using your GitHub account

Received on Friday, 27 September 2019 19:11:41 UTC