
Re: indicating 'private browsing mode' over the net (was Re: Super Cookies in Privacy Browsing mode)

From: David Singer <singer@apple.com>
Date: Fri, 30 Jan 2015 09:21:31 +0100
Cc: "public-privacy mailing list (W3C)" <public-privacy@w3.org>
Message-id: <22F3E5DF-5901-44EC-9D98-D0BDAA22E076@apple.com>
To: Rigo Wenning <rigo@w3.org>

> On Jan 29, 2015, at 20:43 , Rigo Wenning <rigo@w3.org> wrote:
> 
> trimming the cc - list..
> 
> On Thursday 29 January 2015 19:24:45 David Singer wrote:
>>> It would have to include all the servers being accessed, third-parties
>>> also. I think David's header would be seen all of them, and it would only
>>> take one to ignore the contextual boundaries, decide to combine multiple
>>> personas with other data in a PII keyed database, then broadcast it to
>>> the world (and UA based UUIDs are far more reliably user-identifying than
>>> IP addresses which are usually ephemeral and non-unique). 
>> True, but don’t forget we’re coming from a state where the servers don’t
>> even know of the desire.  I don’t mind machine-based discoverability, but
>> it’s tricky to work out how to include transparent proxies and caches in
>> that.
> 
> Now comes the feedback again that I mentioned earlier. On a typical site, 
> there can be 200 trackers or more. If you have a feedback mechanism, you 
> know who is making promises and who is not. The machine can work that out 
> while it would be overkill for the end-user. In case the feedback is that my 
> request won't be honored, my browser can simply block that GET request, or 
> fool the server or be creative by sending them the cookie from last year, 
> or….

I think that this is interesting, but there are snags.

1. Some sites don’t, in fact, keep history data. They’d have to claim to be honoring the request even though they didn’t have to do anything to comply.  I guess that’s not a high burden.
2. UAs can probe the sites they know are involved, but there are a number of sites that are invisible:
  proxies and transparent caches
  sites that receive relay requests from the sites directly contacted

I guess we could define a ‘strong respecter’ as a top-level site that not only promises to respect the request itself, but also requires the same of all third-party sites involved.
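To make the feedback idea concrete, here is a minimal sketch of the browser-side decision Rigo describes, assuming a hypothetical `Private-Browsing-Honored` response header as the machine-readable promise (no such header exists today; the name and semantics are invented for illustration):

```python
# Hypothetical response header a server might use to promise it honors
# the private-browsing request. Purely illustrative; not standardized.
RESPONSE_HEADER = "Private-Browsing-Honored"

def partition_by_promise(responses):
    """Given a mapping of origin -> response headers, split origins into
    those that promise to honor the request and those that stay silent
    or refuse (candidates for blocking, stale cookies, etc.)."""
    honoring, others = set(), set()
    for origin, headers in responses.items():
        if headers.get(RESPONSE_HEADER) == "1":
            honoring.add(origin)
        else:
            others.add(origin)
    return honoring, others

# Example: a first party that promises, a tracker that says nothing.
honoring, others = partition_by_promise({
    "https://example.com": {RESPONSE_HEADER: "1"},
    "https://tracker.example": {},
})
```

A ‘strong respecter’ would additionally assert that every origin it pulls in appears in the honoring set — which is exactly what the UA cannot verify for proxies, caches, and relayed requests it never sees.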

David Singer
Manager, Software Standards, Apple Inc.
Received on Friday, 30 January 2015 08:22:10 UTC
