- From: Olivier Thereaux <ot@w3.org>
- Date: Thu, 19 Jul 2007 08:31:34 +0900
- To: Ville Skyttä <ville.skytta@iki.fi>
- Cc: www-validator@w3.org
Hi Ville, Hi Stephan.

On Wed, Jul 18, 2007, Ville Skyttä wrote:
> On Tuesday 17 July 2007, Stephan Windmüller wrote:
> > Is it possible to override or disable the robots.txt feature?
>
> Not from the link checker side without modifying the code. Perhaps the target
> site administrators would be willing to allow the link checker to crawl it?

What do you think about the idea, though? Since -q only concerns the output of errors:

    -q, --quiet
        No output if no errors are found (implies -s).

and a link that is not checked because of a robots.txt directive is not per se an error (the message merely informs the user that the link was not checked), shouldn't we modify checklink accordingly?

--
olivier
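For reference, a site administrator can allow the link checker while still restricting other robots by adding a record for its user agent to robots.txt. A minimal sketch, assuming "W3C-checklink" is the product token the checker sends (the token its documentation uses):

    User-agent: W3C-checklink
    Disallow:

An empty Disallow line means nothing is disallowed for that agent, so checklink may crawl the site even if a later "User-agent: *" record blocks other robots.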
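The behaviour Olivier suggests could be sketched as follows: classify a robots.txt skip as informational rather than an error, and have quiet mode print errors only. This is a minimal Python illustration (checklink itself is Perl), and the names Severity and report are hypothetical, not taken from checklink's source:

    from enum import Enum

    class Severity(Enum):
        INFO = 0   # e.g. a link skipped because of robots.txt
        ERROR = 1  # e.g. a broken link

    def report(results, quiet=False):
        # Under --quiet, suppress informational messages and keep errors.
        for severity, message in results:
            if quiet and severity is not Severity.ERROR:
                continue
            print(message)

    # With quiet=True, only the 404 line is printed; the robots.txt
    # skip is informational and stays silent.
    report(
        [
            (Severity.INFO, "http://example.org/a: not checked (robots.txt)"),
            (Severity.ERROR, "http://example.org/b: 404 Not Found"),
        ],
        quiet=True,
    )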
Received on Wednesday, 18 July 2007 23:31:39 UTC