Re: indicating 'private browsing mode' over the net (was Re: Super Cookies in Privacy Browsing mode)

On Friday 30 January 2015 9:21:31 David Singer wrote:
> > Now comes the feedback again that I mentioned earlier. On a typical site,
> > there can be 200 trackers or more. If you have a feedback mechanism, you
> > know who is making promises and who is not. The machine can work that
> > out, while it would be overkill for the end-user. In case the feedback is
> > that my request won't be honored, my browser can simply block that GET
> > request, fool the server, or get creative by sending them the cookie from
> > last year, or….
> 
> I think that this is interesting, but there are snags.

Yes, but the issues below are not a big problem IMHO.
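
To make that feedback loop concrete, here is a rough sketch of the
check-and-block step. The promises table and all names are illustrative,
not a real browser API:

    // Minimal sketch: the browser keeps a table of which origins have
    // promised to honor the privacy request (filled by a probe, see below).
    const promises = new Map<string, boolean>(); // origin -> promised?

    function shouldBlock(requestUrl: string): boolean {
      const origin = new URL(requestUrl).origin;
      // Unknown or negative feedback -> block the GET (or get creative:
      // fool the server, replay last year's cookie, ...).
      return promises.get(origin) !== true;
    }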
> 
> 1. Some sites don’t, in fact, keep history data. They’d have to claim to
> be honoring the request even though they didn’t have to do anything to
> honor it. I guess that’s not a high burden.

And they can claim to honor your request precisely because they do not keep 
data. You cannot determine whether they "keep" things once those things have 
been requested and collected, as you don't have access to their systems. OK, 
you can guess that they do if they maintain state, or if an ad sticks to you 
like a piece of dog shit you stepped in.
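
What a browser can do is at least read what a site claims. A sketch, modeled
on the tracking status resource from the W3C TPE draft (/.well-known/dnt/);
treat the endpoint and the JSON shape as assumptions about that draft rather
than settled protocol:

    // Probe an origin's declared tracking status. This verifies that a
    // machine-readable promise exists, not that data is actually deleted
    // server-side -- that part stays unverifiable from the outside.
    async function claimsToHonor(origin: string): Promise<boolean> {
      try {
        // Don't let the probe itself carry identifying state.
        const res = await fetch(`${origin}/.well-known/dnt/`,
                                { credentials: "omit" });
        if (!res.ok) return false; // no machine-readable promise at all
        const status = await res.json();
        // In the TPE vocabulary "N" means "not tracking"; anything else
        // is at best a qualified promise.
        return status.tracking === "N";
      } catch {
        return false; // unreachable or malformed -> treat as no promise
      }
    }

A positive result would then populate the promises table from the first
sketch.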

> 2. UAs can probe the sites they know are involved, but there are a number
> of sites that are invisible: proxies and transparent caches, and sites
> that receive relayed requests from the sites directly contacted.

Here you're definitely over-engineering it. It would already be a great 
improvement if browsers reacted to the GET requests triggered by a 
user-loaded page (resources fetched without an active click and without the 
user typing the address into the navigation field).
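
In (Chrome-style) extension terms, the distinction is roughly the request
type: a top-level navigation is what the user asked for; everything else was
triggered by the loaded page. A sketch, reusing the illustrative
shouldBlock() from above:

    declare const chrome: any; // extension global, typed loosely here

    // Block page-triggered GETs to origins that gave no positive feedback,
    // but never interfere with the navigation the user actually requested.
    chrome.webRequest.onBeforeRequest.addListener(
      (details: { type: string; url: string }) => {
        if (details.type === "main_frame") return {}; // user-initiated
        return { cancel: shouldBlock(details.url) };
      },
      { urls: ["<all_urls>"] },
      ["blocking"]
    );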
> 
> I guess we could define a ‘strong respecter’ as a top-level site that not
> only promises to respect the request itself, but also requires that of all
> third-party sites involved.

I don't see a need for that; I would really like to fix the basic case first.

 --Rigo
