- From: Rigo Wenning <rigo@w3.org>
- Date: Wed, 20 Mar 2013 22:44:46 +0100
- To: Adrian Bateman <adrianba@microsoft.com>
- Cc: "public-tracking@w3.org Working Group" <public-tracking@w3.org>
On Wednesday 20 March 2013 15:26:44 Adrian Bateman wrote:
> I don't believe most "everyday" browsers will take the performance hit
> of downloading the TSR.

This is sad for privacy on the Web. A browser that can read a chain of CSS files, that has client-side storage for all kinds of information, and that can wait until an entire industry has run an auction on a targeted user selected from a million profiles, is not capable of reading, parsing and using a 1k chunk of information and providing useful information to the user?

Performance is a pseudo-argument and it will remain one. If the very little information in the TSR has a performance or scalability problem, there must be a way to solve it. Or are you saying that the web knowledge around the table has failed to deliver a scalable solution? If so, what is the problem? You need a round trip? Blame Roy! A header would not have needed any more round trips.

Or is the TSR not needed because the engineers do not believe in the legal dependencies of all this? In 2002 the browser makers believed that cookie blocking was much better than transparency. That has utterly failed. Are we repeating history? If all we do is spawn DNT:1 headers to cut down services, that would have been much simpler. If all we do is create a JavaScript API to populate a store with alleged exceptions to cookie blocking, that would have been much simpler.

Performance was already Microsoft's argument for P3P compact policies, with the known result. Is Microsoft again giving up halfway? I can't really believe that this argument comes back after the experience we had with half-baked things like compact policies. My hope remains that Firefox, Opera and WebKit will do better than that.

--Rigo
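For a sense of how little work "reading, parsing and using a 1k chunk of information" is, here is a minimal sketch of a client fetching a site's Tracking Status Resource. It assumes the `/.well-known/dnt/` location and JSON member names ("tracking", "compliance", "policy") roughly as in the Working Group drafts; the exact path and field names may differ in the final specification.

```typescript
// Sketch only: fetch and parse a site's Tracking Status Resource (TSR).
// Path and member names are assumptions based on Working Group drafts.

interface TrackingStatus {
  tracking?: string;      // tracking status value claimed by the site
  compliance?: string[];  // URIs of compliance regimes claimed
  policy?: string;        // link to a human-readable policy
}

async function fetchTrackingStatus(origin: string): Promise<TrackingStatus | null> {
  try {
    const response = await fetch(`${origin}/.well-known/dnt/`, {
      headers: { Accept: "application/json" },
    });
    if (!response.ok) return null;      // site publishes no TSR
    return (await response.json()) as TrackingStatus;
  } catch {
    return null;                        // network error: status unknown
  }
}

// Example: surface the claimed status so the browser can inform the user.
fetchTrackingStatus("https://example.com").then((tsr) => {
  if (tsr?.tracking) {
    console.log(`example.com claims tracking status: ${tsr.tracking}`);
  }
});
```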
Received on Wednesday, 20 March 2013 21:45:10 UTC