- From: Stephan Windmüller <windy@white-hawk.de>
- Date: Wed, 18 Jul 2007 11:28:03 +0200
- To: www-validator@w3.org
On Wed, 18. Jul 2007, Michael Adams wrote:
> > We want to use checklink to search for broken links on our site. Of
> > course there should be only a warning mail when something went
> > wrong. Unfortunately this is also true when external links are
> > disallowed by a robots.txt.
> Why not FTP into your site and download then delete robots.txt
> temporarily? Upload again to restore when finished.

The problem is not our robots.txt but those of external sites.

- Stephan
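For illustration only, here is a minimal Python sketch (not the checklink code itself) of why an external link gets skipped and reported: if the external site's robots.txt disallows the checker's user agent, the link cannot be fetched. The robots.txt content and the example URL are made up, and "W3C-checklink" as the user agent name is an assumption.

    from urllib.robotparser import RobotFileParser

    # Hypothetical robots.txt served by an external site that blocks all robots.
    robots_lines = [
        "User-agent: *",
        "Disallow: /",
    ]

    rp = RobotFileParser()
    rp.parse(robots_lines)

    # "W3C-checklink" is assumed here as the checker's user agent name.
    url = "https://example.org/some/page"
    if not rp.can_fetch("W3C-checklink", url):
        print(url, "is disallowed by robots.txt, so the checker can only warn")

In this situation the link is not actually broken; the checker simply is not allowed to verify it, which is why it shows up in the warning output.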
Received on Wednesday, 18 July 2007 09:28:32 UTC