- From: BearHeart / Bill Weinman <bearheart@bearnet.com>
- Date: Mon, 19 Feb 1996 07:10:17 -0600
- To: mirsad.todorovac@fer.hr
- Cc: http-wg%cuckoo.hpl.hp.com@hplb.hpl.hp.com
At 09:43 am 2/19/96 +0100, Mirsad Todorovac spake:
>> It would be really nice if there were a response code (say, 405) for
>> "robot forbidden that URL." Technically, "forbidden" is already covered
>> through 403, but it would still be nice to have something more
>> descriptive.
There is already a method of dealing with this that takes much
less traffic than responding on a URL-by-URL basis.
The "robots.txt" file is described at:
http://info.webcrawler.com/mak/projects/robots/norobots.html
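As a sketch of how a well-behaved robot honors that file: it fetches /robots.txt once per site, then checks each candidate URL against the Disallow rules before requesting it. Using Python's standard-library `urllib.robotparser` (a modern convenience, not something the original note assumes):

```python
from urllib.robotparser import RobotFileParser

# Parse a small robots.txt as a list of lines; a real crawler would
# fetch http://<site>/robots.txt once and cache the parsed rules.
rp = RobotFileParser()
rp.parse([
    "User-agent: *",        # rules apply to every robot
    "Disallow: /private/",  # robots must not fetch anything under /private/
])

# Check individual URLs before requesting them.
rp.can_fetch("MyBot", "http://example.com/private/page.html")  # disallowed
rp.can_fetch("MyBot", "http://example.com/index.html")         # allowed
```

The hostname and paths above are illustrative. The point of the mechanism is exactly the traffic savings described: one small file per site replaces a per-URL error response for every disallowed request.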
+--------------------------------------------------------------------------+
| BearHeart / Bill Weinman | BearHeart@bearnet.com | http://www.bearnet.com/
| Author of The CGI Book -- http://www.bearnet.com/cgibook/
Received on Monday, 19 February 1996 05:32:01 UTC