Mod_DNT


A quick update on the status of Mod_DNT.
 

Preamble:

The reason for this preamble is so that people don't waste time attacking
my credibility when it comes to shipping real code that implements a spec
(Accept-Encoding: gzip).
 

My partner and I are the inventors of Mod_Gzip, which is the de facto
standard for content acceleration on the web. It is currently installed
on tens of millions of Apache servers worldwide. We know how to
accelerate content to a browser without slowing down a web server. In
addition, we know just about every parsing trick in the book, how to
write very, very fast routines, and how to do all of this with minimal
impact on the server.
 

There is already a new version of Mod_Gzip which incorporates the DNT
blacklist capability: it will load a remote file, from any location,
containing the updated list of blacklisted UAs, and store it in memory.
 

We also have a PHP version of Mod_DNT which reads from a remote file
that could be the official W3C DNT blacklist. We wrote this for those
admins who simply cannot add a module to their Apache servers, or who
may prefer to use PHP, AND who still have to support the spec.
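
For illustration, here is a minimal sketch of the kind of check a
mod_dnt.php-style script performs. The blacklist URL and the
one-User-Agent-per-line file format are assumptions for illustration;
the actual script may differ:

<?php
// Minimal sketch of a mod_dnt.php-style check. The blacklist URL and the
// one-User-Agent-per-line file format are assumptions for illustration.
$blacklistUrl = 'http://example.com/mod_dnt_blacklisted_user_agents.txt'; // hypothetical

// Fetch the remote blacklist on every request (the slow path measured below).
$raw = @file_get_contents($blacklistUrl);
$blacklist = ($raw === false)
    ? array()
    : array_filter(array_map('trim', explode("\n", $raw)));

$ua = isset($_SERVER['HTTP_USER_AGENT']) ? $_SERVER['HTTP_USER_AGENT'] : '';

foreach ($blacklist as $badUa) {
    if (stripos($ua, $badUa) !== false) {
        // Blacklisted UA: its DNT header cannot simply be honored.
        header('HTTP/1.1 400 Bad Request');
        exit('DNT header from this User-Agent cannot be honored.');
    }
}
// Otherwise continue serving the page and honor DNT normally.
?>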
 

Again, all of this can be performance-enhanced (and we know how to do
it). However, as we have learned over many years, it's always easy to
make something go fast. The hard part is ensuring that you can actually
implement it in the first place in EVERY sort of environment.

/End preamble.
 

There is a new version of mod_dnt.php running at 5o9mm.com…
 
http://www.5o9mm.com/mod_dnt_test_1.php
 

It is still doing the rational thing and expecting there to be an
'official/up-to-date' DNT BLACKLIST somewhere online. The 'official'
remote BLACKLIST file still contains only a minimal set of 5
'blacklisted' User-Agent strings.
 

It still averages 340 to 360 milliseconds PER REQUEST to do the 'DNT
verification' in this manner.
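
For reference, the per-request cost can be measured with a simple
wrapper like the following (the URL is hypothetical; presumably most of
the elapsed time is the network round trip to the remote file):

<?php
// Sketch: time the remote blacklist fetch to see where the 340-360 ms goes.
$start = microtime(true);
$raw = @file_get_contents('http://example.com/mod_dnt_blacklisted_user_agents.txt'); // hypothetical
$elapsedMs = (microtime(true) - $start) * 1000;
header('X-DNT-Check-Ms: ' . round($elapsedMs, 1)); // expose the timing for debugging
?>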

 
New headers added are…
 

HTTP_DNT_BLACKLIST_TYPE = REMOTE FILE ( Simulated Official/Current W3C Online BLACKLIST )
HTTP_DNT_BLACKLIST_LOCATION = http://itm1.ismysite.co.uk/mod_dnt_blacklisted_user_agents.txt
 

Multiply 340 milliseconds (per request) by even just 10 'callbacks' to
assemble your existing HTTP page… and you can add 3.4 seconds of 'load
time' to your page just because you are 'trying' to be DNT compliant.

 
To be workable in the real world, server admins are going to need to
get EVERYTHING down to sub-millisecond times: read the incoming
request, parse it, make the callbacks as you assemble the page, and, in
the case of a NON-COMPLIANT UA, send a 400 response back to the user
asking for more clarification on what should happen next.
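
One way to get there (a sketch, not our shipping code): cache the
remote blacklist on local disk with a time-to-live, so the per-request
cost is a local read and a string scan rather than a network round
trip. The cache path, TTL, and URL below are assumptions:

<?php
// Sketch: cache the remote blacklist locally and re-fetch only when stale.
$cacheFile = '/tmp/dnt_blacklist.txt'; // hypothetical cache path
$ttl       = 300;                      // hypothetical TTL: re-fetch every 5 minutes

if (!is_file($cacheFile) || (time() - filemtime($cacheFile)) > $ttl) {
    $raw = @file_get_contents('http://example.com/mod_dnt_blacklisted_user_agents.txt'); // hypothetical
    if ($raw !== false) {
        file_put_contents($cacheFile, $raw);
    }
}

// Subsequent requests read from local disk (usually the OS page cache),
// so the per-request cost is a string scan, not a network round trip.
$blacklist = is_file($cacheFile)
    ? array_filter(array_map('trim', file($cacheFile)))
    : array();
?>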

 
The Apache Module version (Mod_Gzip(2)) already does all of this very,
very quickly. 

But that's still NOT the core problem.
 

Do I even need Mod_DNT in the first place?
 

It only takes one vendor implementing the spec differently (Microsoft),
and regardless of anything else you will HAVE to check every single
incoming UA for even the presence of an invalid header. No matter how
good your server admins are, there's going to be a performance hit.
(Think mobile here.)
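
To make that concrete, here is a sketch of the per-request check this
implies, assuming a DNT field value of '0' or '1' and reusing the
cached blacklist file from the earlier sketch. The helper and the 400
responses are illustrative, not spec-mandated:

<?php
// Sketch: every request must be inspected for a DNT header, and a UA known
// to send DNT without a real user choice has to be treated as invalid.
function dnt_ua_is_blacklisted($ua) {
    // Illustrative helper: substring-match the UA against the cached list.
    $lines = @file('/tmp/dnt_blacklist.txt'); // hypothetical cache path
    if ($lines === false) {
        return false;
    }
    foreach ($lines as $badUa) {
        $badUa = trim($badUa);
        if ($badUa !== '' && stripos($ua, $badUa) !== false) {
            return true;
        }
    }
    return false;
}

$dnt = isset($_SERVER['HTTP_DNT']) ? trim($_SERVER['HTTP_DNT']) : null;
$ua  = isset($_SERVER['HTTP_USER_AGENT']) ? $_SERVER['HTTP_USER_AGENT'] : '';

if ($dnt !== null) {
    if ($dnt !== '0' && $dnt !== '1') {
        // Malformed DNT value.
        header('HTTP/1.1 400 Bad Request');
        exit('Invalid DNT header.');
    }
    if (dnt_ua_is_blacklisted($ua)) {
        // Valid syntax, but sent by a UA known not to reflect a deliberate
        // user choice, so it cannot simply be honored.
        header('HTTP/1.1 400 Bad Request');
        exit('DNT header from this User-Agent cannot be honored.');
    }
}
?>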

 
And there is still NO idea of what the server response looks like if it
is indeed an invalid UA.

See Heather's email on the subject, "UI and Scope":
http://lists.w3.org/Archives/Public/public-tracking/2012Jun/0458.html
 
 
 

Roy is 100% correct. At some point someone is going to have to implement
this spec. If it does NOT scale to every environment and EVERY screen
size, then it will, as Ninja says, "simply be ignored".



Peter
___________________________________
Peter J. Cranstone
720.663.1752
