Re: looking for feedback - robots implementation for link checker

Susan, thanks for your feedback.
I am copying the developers' list in my reply, hoping it is fine with you.

On Jun 30, 2004, at 7:47, Susan Lesch wrote:
> With default settings, for http://www.w3.org/ that checked "1 document in 160.4 seconds" vs. 8.5 seconds for ,checklink.

Indeed, you are right.

My understanding of the code changes between the current :80 service and the :8001 one
http://dev.w3.org/cvsweb/perl/modules/W3C/LinkChecker/bin/checklink.diff?r1=3.58&r2=3.17&f=h
is that the sleep_time parameter is now used not only as a delay between checking documents in recursive mode, but also as a delay between the HEAD requests that check the links of a given document.
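
To illustrate, here is a minimal sketch (not the actual checklink code) of what applying sleep_time between per-link HEAD requests implies; the one-second value and the link list are assumptions on my part. If the home page has on the order of 150 links and we pause one second between requests, the sleeps alone would account for most of the 160 seconds Susan measured.

    #!/usr/bin/perl
    # Sketch only: throttled HEAD requests over the links of one document.
    # The 1-second sleep_time and the @links list are assumptions, not
    # the actual checklink defaults.
    use strict;
    use warnings;
    use LWP::UserAgent;

    my $sleep_time = 1;        # seconds between consecutive HEAD requests
    my @links      = @ARGV;    # links extracted from the document under test

    my $ua = LWP::UserAgent->new(agent => 'sketch-checklink/0.1', timeout => 30);

    for my $i (0 .. $#links) {
        my $response = $ua->head($links[$i]);
        printf "%-60s %s\n", $links[$i], $response->status_line;
        sleep $sleep_time if $i < $#links;   # total added delay ~ (N-1) * sleep_time
    }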

That (as well as the robots implementation, BTW) was, IIRC, decided as a means of responding to criticism that the link checker was a "misbehaving" agent. OTOH, it obviously makes the checker very slow for documents with a lot of links. We should try to find a good tradeoff (a dedicated intra-W3C instance without such limitations was one idea), but it is a difficult balance to achieve.
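
As a sketch of the other half of the "well-behaved agent" changes, the snippet below shows how a robots.txt-aware user agent with a configurable request delay can be set up with LWP::RobotUA. The agent name, contact address and delay value are assumptions, and this is not the code checklink actually uses.

    #!/usr/bin/perl
    # Sketch only: a robots.txt-aware agent with a configurable delay.
    # Agent name, contact address and delay value are assumptions.
    use strict;
    use warnings;
    use LWP::RobotUA;

    my $ua = LWP::RobotUA->new('sketch-checklink/0.1', 'webmaster@example.org');
    $ua->delay(1/60);     # LWP::RobotUA takes the delay in minutes; ~1 second here
    $ua->use_sleep(1);    # sleep between requests rather than returning 503

    my $response = $ua->head('http://www.w3.org/');
    print $response->status_line, "\n";   # 403 if robots.txt disallows this agent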

Thanks
-- 
olivier

Received on Tuesday, 29 June 2004 19:34:53 UTC