Re: W3C Link Checker

Hi Gary, 
 
The problem is that the link checker's output pages really do have to
be forbidden to robots, and especially to the W3C robot which checks
the links.
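 
For concreteness, a minimal sketch of the kind of robots.txt rule that
does this (assuming the checker lives under /checklink on its host, as
the public W3C checker does; the exact rules W3C uses may differ):

  # Keep all robots out of the link checker's output pages
  User-agent: *
  Disallow: /checklink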
 
People often link to the link checker's output page directly from the
web page which has just been checked.
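 
For example, a page at http://example.org/ (an illustrative address)
might carry a "check my links" link of the form:

  http://validator.w3.org/checklink?uri=http://example.org/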
 
If all the search engine robots were allowed to follow those links,
they would trigger a fresh link check of the target page every time
they followed one.
 
Worse still, the poor W3C robot would get into a circular loop when it
saw a page linking to the link check page for itself: it would follow
the link check link and automatically call on itself to retest all the
links in the page (including adding another link check link to its
wait queue), ad nauseam.
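 
Traced out with the illustrative addresses above, and assuming the
checker follows links like any other robot, the loop looks like this:

  1. The checker is asked to test http://example.org/.
  2. That page links to
     http://validator.w3.org/checklink?uri=http://example.org/.
  3. The checker follows that link, which starts a new check of
     http://example.org/.
  4. The new check finds the same link, and the cycle repeats.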
 
Does that make sense?
 
Andrew Cates
http://catesfamily.org.uk
 

Received on Thursday, 21 February 2008 14:55:21 UTC