Re: Input on threat model from browser privacy summit slides

On Thu, Feb 20, 2020 at 2:33 PM Joshue O Connor <joconnor@w3.org> wrote:
>
> Nala Ginrut wrote on 20/02/2020 13:52:
> > Hi Maciej Stachowiak and all contributors!
> > Thanks for all the work!
> > I'd like to share some comments here:
> >
> > 1. "Benign information disclosure...system preferences [like dark mode]"
> > Do we really care that someone may know what theme we are using?
> While avoiding saying outright that this is the case here... *if* it is
> a way of fingerprinting or identifying, when combined with other
> accessibility-related changes in the browser, that the person has a
> disability, then I would say yes.
>
> Many will use dark mode just because they like it, without any
> accessibility need, and they may not stick out from the herd. But this
> information, when used in conjunction with other unique configuration
> settings such as a larger font size, particular font usage/symbol sets,
> etc., may be used to identify users with disabilities.

"Yes, but...". I'll make a stronger statement and say "user
customizable preferences should either 1) not cause a reduction in the user's
privacy, or 2) explicitly warn users if adjusting that setting will reduce
their privacy". For this specific case, if a user adjusts their chrome/browser
theme such that it uses a "dark theme" instead of a "light theme", and this
adjustment does not modify any part of a site's content, then this should have
zero privacy implications. I don't know how Chrome's content dark mode is
implemented, so that may change a browser's fingerprint.
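As a rough illustration of why exposing the preference to content matters (a
minimal sketch, assuming a browser that supports the standard matchMedia API
and the prefers-color-scheme media feature):

    // Minimal sketch: reading the user's color-scheme preference from page
    // script via the CSS media-query API. If the browser exposes the user's
    // dark-mode setting to content, this one bit becomes fingerprint surface.
    function colorSchemeBit(): string {
      if (typeof window.matchMedia !== "function") {
        return "unknown"; // API not available
      }
      if (window.matchMedia("(prefers-color-scheme: dark)").matches) {
        return "dark";
      }
      if (window.matchMedia("(prefers-color-scheme: light)").matches) {
        return "light";
      }
      return "no-preference"; // neither query matched
    }

    // A site could fold this into whatever other signals it already collects.
    console.log("color-scheme:", colorSchemeBit());

A chrome-only theme never reaches this code path; a content-visible dark mode
does, which is exactly the distinction I'm drawing above.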

Similarly, I would like to live in a world where adding (something like) a
"Bookmarks toolbar" in the chrome doesn't change a browser's fingerprint, but
something this simple has real fingerprinting implications right now, for
example by changing the window dimensions a page can observe.
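To make that concrete (a sketch, assuming a maximized window so that chrome
height is the main thing varying between the inner and outer measurements):

    // Minimal sketch: window-geometry values any page script can read today.
    // Toggling browser chrome such as a bookmarks toolbar changes innerHeight
    // relative to outerHeight, so the toolbar's presence leaks into this
    // fingerprint component.
    function chromeGeometrySignal(): string {
      const chromeHeight = window.outerHeight - window.innerHeight; // toolbars, tab strip
      const chromeWidth = window.outerWidth - window.innerWidth;    // scrollbars, side panels
      return [window.screen.width, window.screen.height,
              chromeWidth, chromeHeight].join("x");
    }

    console.log("geometry:", chromeGeometrySignal());

None of this requires any new API; it falls out of measurements pages have
always been able to make, which is why chrome-level changes end up having
content-level privacy consequences.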

Received on Friday, 21 February 2020 19:30:06 UTC