Re: Making Linkchecker work in parallel

On Friday, 02 September 2005 at 18:59 +0900, olivier Thereaux wrote:
> Some printf debugging (print $response->as_string;) leads me to
> think the issue was not 401s but cases of 403 (Forbidden by
> robots.txt).
> ... so the bug may be with LWP::Parallel::RobotUA or LWP::RobotUA  
> upstream.

I got it to work by adding

    $res->request($request);

after line 465 of RobotUA.pm. That said, I don't know whether this is
a bug in Parallel::RobotUA or too strong an assumption on our side
that a response necessarily has an attached request (or a previous
request attached). Maybe we do have to make that assumption; I don't
know.
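For the record, here is a minimal Perl sketch of the failure mode and
of what the one-line fix does. This is not the actual RobotUA.pm
source, and the URL is a placeholder: when a robot UA denies a fetch
because of robots.txt, it synthesizes the 403 response itself, so
nothing attaches the request to it unless the code does so explicitly.

    use strict;
    use warnings;
    use HTTP::Request;
    use HTTP::Response;

    my $request = HTTP::Request->new(GET => 'http://example.org/private/');

    # A robots-aware UA builds this response itself when robots.txt
    # denies the fetch, so LWP's normal request/response plumbing
    # never attaches the request to it:
    my $res = HTTP::Response->new(403, 'Forbidden by robots.txt');

    # Without the next line, $res->request is undef, and any caller
    # doing $res->request->uri dies:
    $res->request($request);

    print $res->request->uri, ": ", $res->status_line, "\n";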

I've attached the ultra-simplistic Perl script that I used to test
what was going wrong (instead of the full checklink).
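
(The attachment is not reproduced here. A test in the same spirit
might look like the sketch below; the agent name, contact address,
and URL are placeholders, and the register/wait calls follow the
documented LWP::Parallel::UserAgent interface that Parallel::RobotUA
inherits.)

    #!/usr/bin/perl
    use strict;
    use warnings;
    use LWP::Parallel::RobotUA;
    use HTTP::Request;

    # Placeholder agent name, contact address, and target URL.
    my $ua = LWP::Parallel::RobotUA->new('checklink-test/0.1',
                                         'user@example.org');
    $ua->delay(0);    # no politeness delay while testing

    # Queue a page that robots.txt is expected to deny.
    $ua->register(HTTP::Request->new(GET => 'http://example.org/denied/'));

    # Fetch everything queued, then inspect each response.
    my $entries = $ua->wait();
    foreach my $key (keys %$entries) {
        my $res = $entries->{$key}->response;
        if (my $req = $res->request) {
            print "request attached: ", $req->uri, "\n";
        } else {
            print "no request attached to the response!\n";
        }
        print $res->as_string;
    }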

Dom
-- 
Dominique Hazaël-Massieux - http://www.w3.org/People/Dom/
W3C/ERCIM
mailto:dom@w3.org

Received on Monday, 5 September 2005 11:04:12 UTC