
Re: New response code

From: BearHeart / Bill Weinman <bearheart@bearnet.com>
Date: Mon, 19 Feb 1996 12:06:44 -0600
Message-Id: <2.2.32.19960219180644.00677098@204.145.225.20>
To: Shel Kaphan <sjk@amazon.com>
Cc: mirsad.todorovac@fer.hr, http-wg%cuckoo.hpl.hp.com@hplb.hpl.hp.com
At 08:36 am 2/19/96 -0800, Shel Kaphan spake:
>Some applications would generate pages differently if they are being
>probed by a robot.  For instance, in applications that use URL

>So, I'd like to propose that robots be allowed to identify themselves
>as such by including a simple header line in requests, which ought to
>be passed along to CGI programs.  The header could just be "robot: true"

   I don't see that this is necessarily an issue for HTTP. If you 
are interested in pursuing it, I would suggest that you take it up 
with the Robots mailing list. Here's their subscription info: 

        "To subscribe to this list, send a mail message to 
         robots-request@webcrawler.com, with the word subscribe 
         on the first line of the body."

   If all those folks find they have a pressing need for a new 
header field in HTTP, you may then have the necessary leverage over 
here. OTOH, I think your problem could easily be solved if the 
robot authors would agree to put the word "ROBOT" somewhere in their 
User-Agent headers. It seems just as effective as creating a new 
header field, and you would need their cooperation either way. 
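   For what it's worth, either convention is easy to check for on the 
CGI side. A minimal sketch in Python, assuming the proposed header 
would reach the script as an HTTP_ROBOT metavariable under the usual 
CGI header mapping, and that robots put the word "ROBOT" somewhere in 
User-Agent as suggested above: 

        import os

        def request_is_from_robot():
            # Proposed "Robot: true" request header, as CGI would pass it
            # along (HTTP_ROBOT is assumed from the usual HTTP_* mapping).
            if os.environ.get("HTTP_ROBOT", "").strip().lower() == "true":
                return True
            # User-Agent convention: the word "ROBOT" anywhere in the value.
            return "robot" in os.environ.get("HTTP_USER_AGENT", "").lower()

        # Example CGI response that varies with the check.
        body = "robot view" if request_is_from_robot() else "normal view"
        print("Content-Type: text/plain")
        print()
        print(body)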


+--------------------------------------------------------------------------+
| BearHeart / Bill Weinman | BearHeart@bearnet.com | http://www.bearnet.com/ 
| Author of The CGI Book -- http://www.bearnet.com/cgibook/ 
Received on Monday, 19 February 1996 10:12:48 EST
