Suggestion for W3C Link Checker

Thanks for the broken link finder https://validator.w3.org/checklink. I 
have a suggestion.

Each page on the site I'm checking links to a particular file, and that 
file is blocked by robots.txt. So for every page the link checker 
reads, it reports this error:

    Status: (N/A) Forbidden by robots.txt
    The link was not checked due to robots exclusion rules.
    Check the link manually.
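
For context, the block comes from an ordinary robots.txt Disallow rule. 
The rule I'm hitting is presumably something like this (the path is 
made up for illustration):

    # hypothetical example
    User-agent: *
    Disallow: /private/menu.html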

Checking the link manually has the same effect as letting the link 
checker check it: either way, the file is accessed. So I suggest 
changing the behavior to access the file anyway. (You might add an 
option that tells the link checker to behave as it does now: skip 
links blocked by robots.txt.)
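
To sketch the idea (in Python here; I believe the checker itself is 
Perl, and the names below are my own illustrations, not the checker's 
actual code):

    # Minimal sketch of the proposed behavior: fetch the link by default,
    # and honor robots.txt only when the user asks for today's behavior.
    # "W3C-checklink" is, I believe, the user-agent token the checker
    # matches in robots.txt; the honor_robots flag is hypothetical.
    from urllib import robotparser
    from urllib.parse import urlparse

    def may_fetch(url, user_agent="W3C-checklink", honor_robots=False):
        if not honor_robots:
            return True  # proposed default: access the file anyway
        parts = urlparse(url)
        rp = robotparser.RobotFileParser()
        rp.set_url(f"{parts.scheme}://{parts.netloc}/robots.txt")
        rp.read()  # fetch and parse the site's robots.txt
        return rp.can_fetch(user_agent, url)

With honor_robots=True this reproduces the current behavior; with the 
proposed default, the checker would fetch the link and report its 
real status.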

Jerry Peek
