Re: The Rubber meets the Road - DNT compliance code

Is this UA mobile? 

I've spent 6 years dealing with mobile UAs (including building two mobile
browsers) and 16 years with server-side detection (mod_gzip and more) – if
only it were that easy.

:)


Peter
___________________________________
Peter J. Cranstone
720.663.1752


From:  "Ian Fette   (イアンフェッティ)" <ifette@google.com>
Reply-To:  <ifette@google.com>
Date:  Wednesday, June 13, 2012 9:26 AM
To:  Peter Cranstone <peter.cranstone@gmail.com>
Cc:  W3 Tracking <public-tracking@w3.org>
Subject:  Re: The Rubber meets the Road - DNT compliance code

> Or, "Is this UA mobile? If so, redirect to the mobile site"
> 
> Again, this is neither new nor hard.
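> 
> A minimal sketch of the sort of thing I mean, in PHP since that is what the
> test script below uses (the mobile tokens and the m. hostname are illustrative
> assumptions, not a recommendation):
> 
>     <?php
>     // Naive mobile detection: look for common mobile tokens in the User-Agent.
>     $ua = isset($_SERVER['HTTP_USER_AGENT']) ? $_SERVER['HTTP_USER_AGENT'] : '';
>     $isMobile = (bool) preg_match('/Mobile|Android|iPhone|iPad|BlackBerry/i', $ua);
> 
>     if ($isMobile) {
>         // Redirect to the (hypothetical) mobile host, preserving the request path.
>         header('Location: http://m.example.com' . $_SERVER['REQUEST_URI'], true, 302);
>         exit;
>     }
>     ?>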
> 
> On Wed, Jun 13, 2012 at 8:22 AM, Ian Fette (イアンフェッティ) <ifette@google.com>
> wrote:
>> Many websites already do this -- "serve this JS to this user agent". It is
>> neither complex nor hard.
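>> 
>> For example (a rough sketch, with made-up file names -- not anyone's actual
>> setup):
>> 
>>     <?php
>>     // Pick a script variant based on the User-Agent string.
>>     $ua = isset($_SERVER['HTTP_USER_AGENT']) ? $_SERVER['HTTP_USER_AGENT'] : '';
>> 
>>     // Hypothetical variants; a real site would have its own mapping.
>>     $script = (strpos($ua, 'MSIE 7.0') !== false) ? 'app.legacy.js' : 'app.modern.js';
>> 
>>     header('Content-Type: application/javascript');
>>     readfile($script);
>>     ?>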
>> 
>> 
>> On Wed, Jun 13, 2012 at 7:44 AM, Peter Cranstone <peter.cranstone@gmail.com>
>> wrote:
>>> All,
>>> 
>>> There are a lot of questions around a non-compliant UA sending a DNT header.
>>> There's still no definition, on the forum or in the spec, of what constitutes
>>> a non-compliant UA, or even of who is going to maintain a "blacklist" of those
>>> non-compliant UAs. Finally, there's no description of the message that should
>>> be sent back to the consumer indicating that he's using a non-compliant UA.
>>> 
>>> So I'm posting a link today to what something might look like running on a
>>> server. The reason this is in PHP is that there are a lot of servers (in the
>>> tens of millions) that cannot suddenly start adding server-side modules that
>>> do the detection. So it will all have to be done via a script.
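>>> 
>>> At its core such a script is just reading the incoming request headers, along
>>> these lines (a simplified sketch of the kind of check involved, not the
>>> actual code behind the link below):
>>> 
>>>     <?php
>>>     // Apache/PHP exposes the DNT request header as $_SERVER['HTTP_DNT'].
>>>     $dnt = isset($_SERVER['HTTP_DNT']) ? $_SERVER['HTTP_DNT'] : null;
>>>     $ua  = isset($_SERVER['HTTP_USER_AGENT']) ? $_SERVER['HTTP_USER_AGENT'] : '';
>>> 
>>>     if ($dnt === '1') {
>>>         // DNT is turned on; the script still has to decide whether the UA
>>>         // that sent it counts as "compliant" before acting on it.
>>>     }
>>>     ?>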
>>> 
>>> Think about this for a moment. In the real world, server-side admins are
>>> going to have to add code to EVERY CGI script to do this. The performance
>>> hit is going to be HUGE.
>>> 
>>> Here's the link: http://www.5o9mm.com/mod_dnt_test_1.php
>>> 
>>> We've blacklisted the following browsers:
>>> 
>>> HTTP_DNT_BLACKLISTED_USER_AGENT_1 = Mozilla/4.0 (compatible; MSIE 7.0;
>>> Windows NT 6.1; Trident/5.0)
>>> HTTP_DNT_BLACKLISTED_USER_AGENT_2 = Mozilla/5.0 (compatible; MSIE 9.0;
>>> Windows NT 6.1; Trident/5.0)
>>> HTTP_DNT_BLACKLISTED_USER_AGENT_3 = Mozilla/4.0 (compatible; MSIE 7.0;
>>> Windows NT 6.0; Trident/5.0)
>>> HTTP_DNT_BLACKLISTED_USER_AGENT_4 = Mozilla/5.0 (compatible; MSIE 9.0;
>>> Windows NT 6.0; Trident/5.0)
>>> HTTP_DNT_BLACKLISTED_USER_AGENT_5 = Mozilla/5.0 (Windows NT 6.0; rv:8.0.1)
>>> Gecko/20100101 Firefox/8.0.1
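>>> 
>>> The check itself then boils down to comparing the request's User-Agent (and
>>> DNT header) against that list, roughly like this (a simplified sketch; the
>>> script at the link above may differ in detail):
>>> 
>>>     <?php
>>>     // The same blacklist as above, held in an array for the per-request check.
>>>     $blacklist = array(
>>>         'Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 6.1; Trident/5.0)',
>>>         'Mozilla/5.0 (compatible; MSIE 9.0; Windows NT 6.1; Trident/5.0)',
>>>         'Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 6.0; Trident/5.0)',
>>>         'Mozilla/5.0 (compatible; MSIE 9.0; Windows NT 6.0; Trident/5.0)',
>>>         'Mozilla/5.0 (Windows NT 6.0; rv:8.0.1) Gecko/20100101 Firefox/8.0.1',
>>>     );
>>> 
>>>     $ua  = isset($_SERVER['HTTP_USER_AGENT']) ? $_SERVER['HTTP_USER_AGENT'] : '';
>>>     $dnt = isset($_SERVER['HTTP_DNT']) ? $_SERVER['HTTP_DNT'] : null;
>>> 
>>>     // A DNT: 1 header from a blacklisted UA is treated as non-compliant.
>>>     $nonCompliant = ($dnt === '1') && in_array($ua, $blacklist, true);
>>>     ?>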
>>> 
>>> So every time someone hits the Web site, we have to run a check. The request
>>> time for this check on our server is:
>>> 
>>> REQUEST_TIME = 1339597469
>>> 
>>> That's for a single page. Now multiply it by every page on your Web site that
>>> is scripted. Ouch.
>>> 
>>> Now here's where it gets really interesting. Let's say that I'm on the
>>> blacklist. What does the server do? By rights it should abort the entire
>>> request and send a 400 (Bad Request) response back to the user.
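>>> 
>>> In script terms that would look something like this (again just a sketch; the
>>> wording of the message is made up, and $nonCompliant is the flag from the
>>> blacklist check above):
>>> 
>>>     <?php
>>>     // Placeholder so the sketch stands on its own; normally set by the check above.
>>>     $nonCompliant = isset($nonCompliant) ? $nonCompliant : true;
>>> 
>>>     if ($nonCompliant) {
>>>         // Abort the request with a 400 and tell the user why.
>>>         header('HTTP/1.1 400 Bad Request');
>>>         header('Content-Type: text/plain');
>>>         echo "DNT: 1 was received from a non-compliant user agent and will not be honored.\n";
>>>         exit;
>>>     }
>>>     ?>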
>>> 
>>> So what the heck does the user do now?
>>> 
>>> If this spec is going to be trusted and used, it has to work in the real
>>> world, which is NOT 100% technical. Users turn it on (or have it turned on
>>> for them) and they expect magic. They don't expect to be told that their
>>> browser is non-compliant and that they can either go get another one or get
>>> tracked.
>>> 
>>> 
>>> 
>>> 
>>> Peter
>>> ___________________________________
>>> Peter J. Cranstone
>>> 720.663.1752
>>> 
>> 
> 

Received on Wednesday, 13 June 2012 15:30:51 UTC