Re: [checklink] How to override robots.txt?

From: Olivier Thereaux <ot@w3.org>
Date: Thu, 19 Jul 2007 08:31:34 +0900
To: Ville Skyttä <ville.skytta@iki.fi>
Cc: www-validator@w3.org
Message-ID: <20070718233134.GA32462@w3.mag.keio.ac.jp>

Hi Ville, Hi Stephan.

On Wed, Jul 18, 2007, Ville Skyttä wrote:
> On Tuesday 17 July 2007, Stephan Windmüller wrote:
> 
> > Is it possible to override or disable the robots.txt-feature?
> 
> Not from the link checker side without modifying the code.  Perhaps the target 
> site administrators would be willing to allow the link checker to crawl it?  
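For reference, one way a site administrator could allow the checker while still excluding other robots is a robots.txt record along these lines (a sketch; it assumes the link checker's robots user-agent token is "W3C-checklink", and the disallowed path is illustrative):

```
# Allow the W3C link checker everywhere
User-agent: W3C-checklink
Disallow:

# Keep other robots out of a private area (example path)
User-agent: *
Disallow: /private/
```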

What do you think about the idea, though?

Since -q is only about suppressing output when there are no errors:
-q, --quiet  No output if no errors are found (implies -s).
and a link that is not checked because of a robots.txt directive is not
per se an error (the message merely informs the user that the link was
not checked), shouldn't we modify checklink there?
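To illustrate the distinction being proposed (this is a hypothetical sketch in Python using the standard library's robotparser, not checklink's actual Perl code): links disallowed by robots.txt go into an informational list rather than the error list, so a --quiet run with only robots-skipped links would produce no output.

```python
from urllib.robotparser import RobotFileParser

def classify_links(robots_txt_lines, user_agent, urls):
    """Split links into (errors, infos): a link disallowed by
    robots.txt is recorded as informational, not as an error."""
    rp = RobotFileParser()
    rp.parse(robots_txt_lines)
    errors, infos = [], []
    for url in urls:
        if rp.can_fetch(user_agent, url):
            # A real checker would fetch the URL here and append
            # genuine failures (404s, timeouts, ...) to `errors`.
            pass
        else:
            infos.append(f"{url}: not checked (robots.txt)")
    return errors, infos

errors, infos = classify_links(
    ["User-agent: *", "Disallow: /private/"],
    "W3C-checklink",
    ["http://example.org/ok.html", "http://example.org/private/x.html"],
)
# Under -q/--quiet, nothing would be printed: `errors` is empty,
# even though one link was skipped because of robots.txt.
```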

-- 
olivier
Received on Wednesday, 18 July 2007 23:31:39 GMT