- From: Fred Andrews <fredandw@live.com>
- Date: Fri, 7 Dec 2012 02:05:16 +0000
- To: Nicholas Doty <npdoty@w3.org>, "public-privacy (W3C mailing list)" <public-privacy@w3.org>
- Message-ID: <BLU002-W102AB42C51DEDF37CFC00B7AA440@phx.gbl>
Dear Nicholas,

Thank you for following up on this privacy issue.

I suggest removing any discussion of 'Personal safety and anonymous browsing', as it does not seem relevant. For example, we all accept that losing control of our web banking sign-in details is a threat; we do not require a list of stories showing how damaging this could be. The 'show us the harm' line is just a counter-tactic, and there is no need to answer it. If weight is placed on such stories, people are likely to dispute them and then write off the entire threat.

I also suggest noting that fingerprinting is only one threat arising from UA state leakage, and that even if fingerprinting is conceded, the security of the UA state is still important: users may not want to share their location, the fact that their UA has a camera API, the plugins they use, and so on.

JavaScript implementations can also be fingerprinted; see:

* 'Fingerprinting Information in JavaScript Implementations', Keaton Mowery, Dillon Bogenreif, Scott Yilek, and Hovav Shacham, 2011: http://w2spconf.com/2011/papers/jspriv.pdf

(The first sketch below illustrates the timing idea from this paper.)

A list of resources is also being built on the PUA CG wiki, but could be moved elsewhere: http://www.w3.org/community/pua/wiki/Main_Page

One option being explored by the PUA CG is to limit back channels, which can reduce the risk of active fingerprinting. So long as the outgoing information is restricted to be plausibly consistent with a user running a UA with JS disabled, such leaks are minimized. Content authors could still send JS that probes a UA's implementation and then uses this information within the UA to discriminate, but the state would not be leaked to the web, and the user could attempt a defense in private. (The second sketch below illustrates this.)

The Panopticlick (JS enabled) test currently locks up when run under such restrictions, but otherwise gives the same result as when JS is disabled. An encouraging number of rich JS-based webpages still work well under such restrictions.

Trying to 'sell' a concern for privacy is a challenge. For example, consider 'web intents', which has great potential for improving the security of rich UA experiences by allowing the user to redirect common functions to trusted sites or apps. This could replace social widgets with trusted code that does not track users. However, 'web intents' is not designed to fully exploit these benefits, and the interest groups pushing it have no business interest in limiting tracking. Is there a diplomatic way to swing this around and make web intents support declarative markup, so that a page with JS disabled could still offer rich social widgets, payment solutions, text editors, HTML editors, image editors, form completion, spelling correction, etc., all implemented by user-trusted code that respects the user's choices regarding fingerprinting and tracking? (The third sketch below shows what such markup might hypothetically look like.)
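As a concrete (untested) sketch of the timing approach from the Mowery et al. paper above: time a few micro-benchmarks and treat the vector of results as a coarse signature of the JS implementation. The particular benchmarks and iteration counts here are my own illustrative choices, not taken from the paper.

```javascript
// Time one micro-benchmark: run it `iterations` times and
// return the elapsed wall-clock milliseconds.
function timeBenchmark(fn, iterations) {
  var start = Date.now();
  for (var i = 0; i < iterations; i++) {
    fn();
  }
  return Date.now() - start;
}

// Illustrative micro-benchmarks stressing different engine subsystems.
var benchmarks = [
  function () {                         // string building
    var s = "";
    for (var i = 0; i < 1000; i++) s += "x";
  },
  function () {                         // array allocation and sorting
    var a = [];
    for (var i = 0; i < 1000; i++) a.push((i * 7919) % 101);
    a.sort();
  },
  function () {                         // regular-expression matching
    "aaaaaaaaaaaaaaaaaaab".match(/a+b/);
  }
];

// The relative timings differ across engines and versions, so the
// resulting vector contributes identifying bits even when the
// User-Agent string is spoofed.
var signature = benchmarks.map(function (b) {
  return timeBenchmark(b, 5000);
});
```

The point is that this signature is observable by any page allowed to run JS and report timings home, which is exactly the back channel the PUA CG restrictions aim to close.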
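And a rough sketch of the back-channel restriction itself. A real implementation would live in the UA or an extension; stubbing APIs from page script, as below, is only an approximation and is not tamper-proof, but it shows the shape of the idea: scripts may still run and probe, but script-initiated requests are refused.

```javascript
(function () {
  // Refuse script-driven HTTP requests so that probe results cannot
  // be reported back to the server.
  window.XMLHttpRequest = function () {
    throw new Error("XMLHttpRequest refused: back channels are restricted");
  };

  // A UA-level implementation would also refuse network loads for
  // elements inserted by script (img/script/iframe beacons) and
  // script-triggered form submissions, so the outgoing traffic stays
  // plausibly consistent with a UA that has JS disabled.
})();
```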
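Finally, purely as a hypothetical illustration of the declarative web intents idea: none of the element or attribute names below exist in the Web Intents drafts (only the action URL follows their convention). The point is that the UA itself, not page script, would dispatch the action to a service the user has chosen and trusts.

```html
<!-- Hypothetical markup: a share control that works with page JS
     disabled. The UA dispatches the intent to the user's chosen
     service, so no third-party widget script is ever loaded. -->
<intent-button action="http://webintents.org/share"
               type="text/uri-list"
               data="http://example.com/article">
  Share this page
</intent-button>
```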
You might be interested in feedback from Ian Hickson regarding some of the PUA CG options. While I appreciate that he took the time to reply, and I will work to better communicate the concerns and options, the tone of the reply suggests this will be quite a challenge. BTW I support an environment where people are free and safe to comment, and I respect Ian's input.

http://lists.whatwg.org/pipermail/whatwg-whatwg.org/2012-November/038038.html

cheers
Fred

> From: npdoty@w3.org
> Date: Tue, 4 Dec 2012 17:21:58 -0800
> To: public-privacy@w3.org
> Subject: skeleton draft regarding fingerprinting guidance
>
> Hi all,
>
> Inspired by conversations at the TPAC breakout session on fingerprinting, I've started an outline/draft of a document for giving positive guidance to spec authors about what fingerprinting is exactly and how we might address it across specs.
>
> As you can see, this is a mostly empty outline and obviously just a beginning, and I'm certainly not wedded to any of it. But I thought it might be a good basis for conversation, perhaps on this week's conference call, or just on the list. In particular, documenting the different threats or different levels of success sounded like it would be useful for spec authors who we hear are already thinking about this balancing act.
>
> Thanks in advance for all your thoughts,
> Nick
>
> P.S. Written in Markdown, forgive me if you don't like this syntax. I'm happy to throw this on the wiki or on github if people would like to collaborate on it actively.
>
>
> # Fingerprinting Guidance for Specification Authors
>
> In short, browser fingerprinting is:
>
> the capability of a site to identify or re-identify a visiting user, user agent or device via configuration settings or other observable characteristics.
>
> (A more detailed list of types of fingerprinting is included below.)
>
> ## Privacy threat models
>
> Browser fingerprinting is a potential threat to privacy on the Web. This document does not attempt to provide a single unifying definition of privacy, but we note concerns about loss of anonymity and unexpected correlation of online activity.
>
> Following from the practice of security threat model analysis, we note that there are distinct models of privacy threats for fingerprinting. Defenses against these threats differ, depending on the kind of user and concern.
>
> * Personal safety and anonymous browsing:
>
> > For some users, personal physical safety can be impacted if their online activities can be associated with their real-world identity -- for example, a political author under an unfriendly regime. Correlation of activity across sites (using a common fingerprint) might allow an attacker to connect a name to an online pseudonym. Such users might employ onion routing systems such as Tor to limit network-level linkability but still face the danger of browser fingerprinting to correlate their Web-based activity.
>
> * Unexpected correlation of browsing activity:
>
> > Fingerprinting raises privacy concerns even when real-world identities are not implicated. Some users may be surprised or concerned that an online party can correlate multiple visits (on the same or different sites) to develop a profile or history of the user. This concern is heightened because tools such as clearing cookies do not prevent or "re-set" correlation done via browser fingerprinting.
>
> There are also different levels of success in addressing browser fingerprinting:
>
> * Decreased fingerprinting surface:
> * Increased anonymity set:
> * Client-preventable fingerprinting:
> * Externally detectable fingerprinting:
>
> ## Types of fingerprinting
>
> ### Passive
>
> ### Active
>
> ### Cookie-like (setting/retrieving local state)
>
> ## Mitigations and guidance
>
> ### Weighing increased fingerprinting surface
>
> ### A standardized profile?
>
> ### Do Not Track: a cooperative approach
>
> ## Research
>
> [What are the key papers to read here, historically or to give the latest on fingerprinting techniques? What are some areas of open research that might be relevant?]
>
> ## References
Received on Friday, 7 December 2012 02:05:46 UTC