- From: Eric Gorr <mailist@ericgorr.net>
- Date: Tue, 22 Mar 2005 16:29:48 -0500
- To: www-validator@w3.org
I am running checklink with:

    checklink -q -b -s -r -u

basically trying to suppress as much output as I can, since I only want
to see the broken links. However, I am still getting output that looks
like:

    http://iplab.com/resources/StdCommLib1.js
    Line: 15
    Code: (N/A) Forbidden by robots.txt
    To do: The link was not checked due to robots exclusion rules.
           Check the link manually.

Is there any way to suppress this output as well?

Ideally, what I would like to see is output that looks something like:

    BROKEN: <web page> <broken link>
    BROKEN: <web page> <broken link>
    BROKEN: <web page> <broken link>

rather than a lot of output I don't care about, like:

    Processing <web page>
    This may take some time if the document has many links to check.

That is what the script at

    http://world.std.com/~swmcd/steven/perl/pm/lc/linkcheck.html

does, but checklink seems to be far better in every other respect.

Should I make this a feature request and enter it into the Bugzilla
database?
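(As a stopgap, the output could probably be post-processed into that
form. The following is only a rough sketch, based purely on the sample
output quoted above; the "Processing", URL, and "To do:" line patterns
are assumptions and may need adjusting for other checklink versions.)

    #!/usr/bin/env python
    # filter_checklink.py -- rough workaround sketch, not part of checklink.
    # Assumes (from the sample output above) that pages are announced by
    # "Processing <page>" lines, that each reported link appears as a bare
    # URL line followed by "Line:", "Code:" and "To do:" lines, and that
    # robots-exclusion reports mention "robots exclusion rules".
    #
    # Usage: checklink -b -s -r <url> | python filter_checklink.py

    import re
    import sys

    current_page = "<unknown page>"
    current_link = None

    for raw in sys.stdin:
        line = raw.strip()

        # Remember which page checklink is currently processing.
        m = re.match(r"^Processing\s+(\S+)", line)
        if m:
            current_page = m.group(1)
            current_link = None
            continue

        # A line that is just a URL introduces the link being reported on.
        if re.match(r"^https?://\S+$", line):
            current_link = line
            continue

        # Drop links that were merely excluded by robots.txt, not broken.
        if "robots exclusion rules" in line:
            current_link = None
            continue

        # Treat any other "To do:" report for a pending link as broken.
        if current_link and line.startswith("To do:"):
            print("BROKEN: %s %s" % (current_page, current_link))
            current_link = None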
Received on Wednesday, 23 March 2005 01:21:02 UTC