- From: Ville Skyttä <ville.skytta@iki.fi>
- Date: Wed, 18 Jul 2007 19:45:12 +0300
- To: www-validator@w3.org
On Tuesday 17 July 2007, Stephan Windmüller wrote:

> Is it possible to override or disable the robots.txt-feature?

Not from the link checker side without modifying the code. Perhaps the target site administrators would be willing to allow the link checker to crawl it? More info: http://validator.w3.org/docs/checklink.html#bot

> If not, how can I prevent checklink to list these adresses as errors when
> using "-q"?

I'm afraid modifying the link checker code or postprocessing its output, e.g. with sed or perl, are currently the only ways to do that.
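For the postprocessing route, a minimal sketch of such a filter might look like the following. It assumes the robots.txt-related error lines contain the string "robots.txt"; the exact wording varies between checklink versions, so check your own output first. The sample input lines here are hypothetical stand-ins for real `checklink -q` output, not actual checker messages.

```shell
# Drop any output lines mentioning robots.txt (assumed marker string);
# in practice you would pipe `checklink -q <URL>` into the sed filter.
printf '%s\n' \
  'http://example.org/a: 404 Not Found' \
  'http://example.org/b: 403 Forbidden by robots.txt' \
| sed '/robots\.txt/d'
```

This keeps only the lines that do not match, so genuine errors such as the 404 above still appear while robots.txt exclusions are silently discarded.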
Received on Wednesday, 18 July 2007 16:46:15 UTC