- From: Stephan Windmüller <windy@white-hawk.de>
- Date: Tue, 17 Jul 2007 12:39:49 +0200
- To: www-validator@w3.org
Received on Tuesday, 17 July 2007 18:14:35 UTC
Hello!

We want to use checklink to search for broken links on our site. Of course, a warning mail should only be sent when something actually went wrong. Unfortunately, this also happens when external links are disallowed by a robots.txt.

Is it possible to override or disable the robots.txt feature? If not, how can I prevent checklink from listing these addresses as errors when using "-q"?

TIA
Stephan