
Re: [checklink] How to override robots.txt?

From: Michael Adams <linux_mike@paradise.net.nz>
Date: Wed, 18 Jul 2007 21:14:16 +1200
To: www-validator@w3.org
Message-id: <20070718211416.6f14c43d.linux_mike@paradise.net.nz>

On Tue, 17 Jul 2007 12:39:49 +0200
Stephan Windmüller wrote:

> Hello!
> 
> We want to use checklink to search for broken links on our site. Of
> course, a warning mail should only be sent when something goes wrong.
> Unfortunately, one is also sent when external links are disallowed by
> a robots.txt.
> 
> Is it possible to override or disable the robots.txt feature? If not,
> how can I prevent checklink from listing these addresses as errors
> when using "-q"?
> 

Why not FTP into your site, download robots.txt, and then delete it
temporarily? Upload it again to restore it when you have finished.
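For example, here is a minimal Python sketch of that workaround. The
hostname, credentials, and URLs are placeholders, and it assumes
checklink is on your PATH:

    # Temporarily remove robots.txt, run checklink, then restore it.
    # Hostname, credentials, and URLs below are placeholders.
    import subprocess
    from ftplib import FTP

    HOST = "ftp.example.org"
    USER, PASSWORD = "user", "secret"
    REMOTE = "robots.txt"         # path relative to the FTP login dir
    BACKUP = "robots.txt.bak"

    ftp = FTP(HOST)
    ftp.login(USER, PASSWORD)

    # Back up robots.txt locally, then remove it from the server.
    with open(BACKUP, "wb") as f:
        ftp.retrbinary("RETR " + REMOTE, f.write)
    ftp.delete(REMOTE)

    try:
        # Run the link check while robots.txt is absent.
        subprocess.run(["checklink", "-q", "-r", "http://example.org/"])
    finally:
        # Restore robots.txt even if checklink fails.
        with open(BACKUP, "rb") as f:
            ftp.storbinary("STOR " + REMOTE, f)
        ftp.quit()

The try/finally block makes sure robots.txt is put back on the server
even if checklink exits with an error.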


-- 
Michael
Linux: The OS people choose without $200,000,000 of persuasion.
Received on Wednesday, 18 July 2007 09:13:52 GMT
