
Re: Making Linkchecker working parallel

From: Dominique Hazaël-Massieux <dom@w3.org>
Date: Mon, 05 Sep 2005 13:04:01 +0200
To: olivier Thereaux <ot@w3.org>
Cc: QA Dev <public-qa-dev@w3.org>
Message-Id: <1125918241.8067.36.camel@stratustier>
On Friday 02 September 2005 at 18:59 +0900, olivier Thereaux wrote:
> Some printf debugging (print $response->as_string;) leads me to
> think the issue was not 401s but cases of 403 (Forbidden), i.e.
> "Forbidden by robots.txt"
> ... so the bug may be with LWP::Parallel::RobotUA or LWP::RobotUA  
> upstream.
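
For reference, that 403 can be reproduced with the plain, non-parallel
LWP::RobotUA by requesting any URL that the target site's robots.txt
disallows. A rough sketch, with a placeholder agent name and URL, not
the actual test I used:

    use strict;
    use HTTP::Request;
    use LWP::RobotUA;

    # Placeholder agent and URL; the path is assumed to be
    # disallowed by that site's robots.txt.
    my $ua = LWP::RobotUA->new('qa-dev-test/0.1', 'dom@w3.org');
    $ua->delay(1/60);    # delay is in minutes, i.e. wait 1 second

    my $response = $ua->request(
        HTTP::Request->new(GET => 'http://www.example.org/disallowed'));
    print $response->as_string;    # shows "403 Forbidden by robots.txt"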

I got it to work by adding

    $res->request($request);

after line 465 of RobotUA.pm; that said, I don't know whether this is a
bug in Parallel::RobotUA or too strong an assumption on our side that a
response necessarily has an attached request or a previous request
attached. Maybe we have to make that assumption; I don't know.
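
In the meantime, anything on our side that consumes these responses
could guard against that assumption. A minimal sketch (not the actual
checklink code), assuming a $response coming back from the parallel
RobotUA:

    # Fall back to the previous response's request when the response
    # comes back with no request attached.
    my $request = $response->request
        || ($response->previous && $response->previous->request);
    if ($request) {
        print "Checked ", $request->uri, ": ", $response->status_line, "\n";
    } else {
        warn "Response carries neither a request nor a previous request\n";
    }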

I've attached the ultra-simplistic Perl script that I used to test
what was going wrong (instead of the full checklink).
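
In case the attachment gets stripped somewhere: the idea is simply to
register a single request with LWP::Parallel::RobotUA, wait for it, and
dump whatever comes back. Roughly along these lines (a sketch, not the
exact script, with a placeholder URL):

    use strict;
    use HTTP::Request;
    use LWP::Parallel::RobotUA;

    # Placeholder agent name and URL assumed disallowed by robots.txt.
    my $ua = LWP::Parallel::RobotUA->new('qa-dev-test/0.1', 'dom@w3.org');
    $ua->delay(1/60);

    $ua->register(HTTP::Request->new(GET => 'http://www.example.org/disallowed'));
    my $entries = $ua->wait();

    foreach my $key (keys %$entries) {
        my $res = $entries->{$key}->response;
        print $res->as_string;
        print defined $res->request ? "request attached\n"
                                    : "no request attached!\n";
    }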

Dom
-- 
Dominique Hazaël-Massieux - http://www.w3.org/People/Dom/
W3C/ERCIM
mailto:dom@w3.org





Received on Monday, 5 September 2005 11:04:12 GMT
