Re: Making Linkchecker work in parallel

From: olivier Thereaux <ot@w3.org>
Date: Mon, 12 Sep 2005 10:10:42 +0900
Message-Id: <45747EB5-8890-4FE0-876C-506F7F7D0A94@w3.org>
Cc: QA Dev <public-qa-dev@w3.org>
To: Dominique Hazaël-Massieux <dom@w3.org>


On 5 Sep 2005, at 20:04, Dominique Hazaël-Massieux wrote:
>
> I got it to work by adding $res->request($request); after line 465
> of RobotUA.pm;

I have contacted Marc Langheinrich about this, proposing that patch  
to him. However, it seems we should not expect an answer before the  
end of the month (September 2005).
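
For reference, a minimal sketch of what the patched spot might look  
like. This is illustrative only; I have not checked the actual code  
around line 465 of RobotUA.pm, so the surrounding lines and variable  
names are assumptions:

    use HTTP::Response;

    # Inside LWP::Parallel::RobotUA, where a response is synthesized
    # for a request disallowed by robots.txt (sketch, not the real source):
    my $res = HTTP::Response->new(403, 'Forbidden by robots.txt');
    $res->request($request);  # Dom's fix: attach the originating request
    return $res;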

> that said, I don't know if this is a bug
> of Parallel::RobotUA or too strong an assumption on our side that a
> response necessarily has an attached request OR a previous request
> attached. Maybe we have to make that assumption, I don't know.

It's an assumption on our side... there's too little documentation  
to tell us either way.

I think it's a fair assumption, and I can imagine no one noticed the  
crippled response object: the typical spider/crawler built upon that  
library probably has no post-processing to do on a 403'd resource...
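
In the meantime we could guard against it on the checklink side.  
Something like this (a sketch; $response is the HTTP::Response we get  
back, and previous() walks back through the redirect chain):

    use HTTP::Response;

    # Use the attached request if present, otherwise fall back to the
    # request of the previous response in the chain:
    my $request = $response->request
        || ($response->previous && $response->previous->request);
    warn "response came back without a request attached\n" unless $request;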

-- 
olivier