- From: Michael Adams <linux_mike@paradise.net.nz>
- Date: Wed, 18 Jul 2007 21:14:16 +1200
- To: www-validator@w3.org
On Tue, 17 Jul 2007 12:39:49 +0200 Stephan Windmüller wrote:

> Hello!
>
> We want to use checklink to search for broken links on our site. Of
> course, there should only be a warning mail when something goes wrong.
> Unfortunately, this also happens when external links are disallowed by
> a robots.txt.
>
> Is it possible to override or disable the robots.txt feature? If not,
> how can I prevent checklink from listing these addresses as errors when
> using "-q"?

Why not FTP into your site and download, then delete, robots.txt temporarily? Upload it again to restore it when finished.

--
Michael
Linux: The OS people choose without $200,000,000 of persuasion.
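A minimal sketch of that temporary-removal workaround, assuming FTP access to the directory that holds robots.txt; the host, credentials, and file locations below are placeholders, not anything taken from the original thread:

```python
from ftplib import FTP

HOST = "ftp.example.org"      # hypothetical host
USER = "username"             # hypothetical credentials
PASS = "password"
REMOTE_FILE = "robots.txt"    # assumes robots.txt sits in the FTP login directory
LOCAL_BACKUP = "robots.txt.bak"

# Download a backup copy, then remove robots.txt so checklink is not blocked.
ftp = FTP(HOST)
ftp.login(USER, PASS)
with open(LOCAL_BACKUP, "wb") as f:
    ftp.retrbinary("RETR " + REMOTE_FILE, f.write)
ftp.delete(REMOTE_FILE)
ftp.quit()

# ... run checklink against the site here ...

# Restore robots.txt from the backup once the check is finished.
ftp = FTP(HOST)
ftp.login(USER, PASS)
with open(LOCAL_BACKUP, "rb") as f:
    ftp.storbinary("STOR " + REMOTE_FILE, f)
ftp.quit()
```

Note that while robots.txt is absent, every crawler (not just checklink) is free to fetch the affected pages, so the window should be kept short.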
Received on Wednesday, 18 July 2007 09:13:52 UTC