W3C home > Mailing lists > Public > public-qa-dev@w3.org > September 2005

Re: Making Linkchecker working parallel

From: olivier Thereaux <ot@w3.org>
Date: Fri, 2 Sep 2005 18:59:00 +0900
Message-Id: <40530BB4-C16F-42F4-9769-D08953E9448C@w3.org>
Cc: QA Dev <public-qa-dev@w3.org>
To: Dominique Hazaël-Massieux <dom@w3.org>

On 31 Aug 2005, at 22:02, Dominique Hazaël-Massieux wrote:
> On Wednesday, 31 August 2005 at 19:06 +0900, olivier Thereaux wrote:
>> What we have in CVS now is an (almost) functional solution based on
>> wait(). Except that this doesn't work in the case of a 401.
>
> Actually, I got version 4.23 to work with the simple document  
> attached.
> [...]
> Or to fix the bug in LWP::Parallel if we can identify it, no?

Hmm, thanks.

Some printf debugging (print $response->as_string;) leads me to
think the issue was not 401s after all, but 403s ("Forbidden by
robots.txt") ... so the bug may be in LWP::Parallel::RobotUA, or
upstream in LWP::RobotUA.
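For illustration, here is a minimal sketch of that printf-debugging step, using hand-built HTTP::Response objects (hypothetical values, mirroring the synthetic 403 that LWP::RobotUA produces for a robots.txt-disallowed URL) to show how the two cases can be told apart:

```perl
use strict;
use warnings;
use HTTP::Response;

# Simulated responses for the two cases under discussion:
# a real authentication challenge vs. a robots.txt block.
my $auth  = HTTP::Response->new(401, 'Unauthorized');
my $robot = HTTP::Response->new(403, 'Forbidden by robots.txt');

for my $response ($auth, $robot) {
    # The printf-debugging line from the message above:
    print $response->as_string;

    # A 403 whose message mentions robots.txt is generated by the
    # RobotUA layer itself, not by the remote server.
    if ($response->code == 403 && $response->message =~ /robots\.txt/) {
        print "-> blocked by robots rules, not an auth failure\n";
    }
}
```

Not a fix for the checker itself, just a way to confirm which layer produced the status.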

Test doc attached.
-- 
olivier



Received on Friday, 2 September 2005 09:59:10 GMT
