Re: Input on threat model from browser privacy summit slides

On Fri, Feb 21, 2020 at 6:17 PM David Singer <singer@apple.com> wrote:
>
>
>
> > On Feb 21, 2020, at 10:12 , Joshue O Connor <joconnor@w3.org> wrote:
> >
> >> "Yes, but...". I'll make a stronger statement and say "user
> >> customizable preferences should either 1) not cause a reduction in the user's
> >> privacy, or 2) explicitly warn users if adjusting that setting will reduce
> >> their privacy".
> >
> > +1.  Also note it would be just weird if turning on some a11y customisation did have the unwanted side effect of compromising the user's privacy. But this is what we need to make sure doesn't happen. Weirder still if they were told this was the case beforehand.
> >
>
> The obvious way this happens is that turning on any option — not just accessibility — differentiates you from the default of the population and if it’s detectable, increases your fingerprint surface. Is that something we need to warn about?

In theory, yes, we should. We should distinguish between configuration options
that are presented on a user-facing screen/panel and the under-the-hood config
options/preferences; it's probably better if we focus only on the user-facing
preferences. A browser should know when changing some combination of settings
will change its fingerprint (within known fingerprinting vectors), and warn
accordingly.
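
To make that concrete, here is a rough sketch of what such a check might look
like. This is not any browser's real API; the preference names and the set of
"observable" settings are purely illustrative assumptions. The idea is just
that the browser keeps a list of preferences known to be detectable by content
(via media queries, layout measurements, exposed JS APIs, etc.) and warns when
a change would move the user further away from the default configuration.

```typescript
// Hypothetical sketch only: names and policy are assumptions, not a spec.
type Prefs = Record<string, string | number | boolean>;

// Preferences assumed to be observable by web content through known
// fingerprinting vectors (illustrative names).
const OBSERVABLE_PREFS = new Set([
  "prefersReducedMotion",
  "fontSizeScale",
  "forcedColors",
]);

// Observable preferences that currently differ from the browser defaults.
function observableDeviations(defaults: Prefs, current: Prefs): string[] {
  return Array.from(OBSERVABLE_PREFS).filter((k) => current[k] !== defaults[k]);
}

// Warn if changing `key` to `value` is detectable by content and would move
// the user further from the default configuration most of the population shares.
function shouldWarn(
  defaults: Prefs,
  current: Prefs,
  key: string,
  value: Prefs[string],
): boolean {
  if (!OBSERVABLE_PREFS.has(key)) return false; // not detectable, no new surface
  const next = { ...current, [key]: value };
  return (
    observableDeviations(defaults, next).length >
    observableDeviations(defaults, current).length
  );
}

// Example: enabling reduced motion is observable via a media query, so this
// (illustrative) policy would trigger a warning.
const defaults: Prefs = {
  prefersReducedMotion: false,
  fontSizeScale: 1,
  forcedColors: false,
};
console.log(shouldWarn(defaults, { ...defaults }, "prefersReducedMotion", true)); // true
```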

Received on Friday, 21 February 2020 19:29:23 UTC