Real-time problem with libwww 5.2.8

hi all,

I am facing a real-time (timing) problem that occurs only when I crawl an
intranet (which obviously responds much faster than the Web).

the facts:

I am developing a web robot with libwww 5.2.8 on Solaris 2.7.

I use the event loop as shown in the W3C sample webbot (requests and
responses are processed asynchronously as they arrive).

Requests are sent in non-preemptive mode.
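
In case it helps, here is a simplified sketch of my setup (the application
name/version and the handler body are placeholders; error handling omitted):

#include "WWWLib.h"
#include "WWWInit.h"

/* "After" filter: the core calls this when a request terminates. */
PRIVATE int terminate_handler (HTRequest * request, HTResponse * response,
                               void * param, int status)
{
    /* ... parse the document, queue the links found, etc. ... */
    HTRequest_delete(request);
    return HT_OK;
}

PRIVATE void setup (void)
{
    /* Non-preemptive robot profile, as in the webbot sample. */
    HTProfile_newRobot("MyRobot", "1.0");

    /* Register the termination handler for all requests. */
    HTNet_addAfter(terminate_handler, NULL, NULL, HT_ALL, HT_FILTER_LAST);
}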

To increase efficiency I load the request manager with a batch of requests
(about 50), built from a list of URLs, by calling HTLoadAnchor() for each.
At that point I have not yet started the event loop
(HTEventList_newLoop()). My understanding is that the requests should be
kept in the request manager until the amount of pending data exceeds the
limit or the timeout expires.
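
Roughly, that batching step looks like this (the URL list and count are
placeholders; error checking omitted):

#include "WWWLib.h"
#include "WWWInit.h"
#include "HTEvtLst.h"   /* HTEventList_newLoop() */

PRIVATE void load_batch (char ** url_list, int url_count)
{
    int i;
    for (i = 0; i < url_count && i < 50; i++) {
        HTRequest * request = HTRequest_new();
        HTAnchor * anchor = HTAnchor_findAddress(url_list[i]);
        HTRequest_setPreemptive(request, NO);   /* asynchronous request */
        HTLoadAnchor(anchor, request);          /* I expected this to queue */
    }

    /* Only here do I start the event loop; I expected the requests above
     * to be held in the request manager until this point. */
    HTEventList_newLoop();
}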

However, it looks as if requests are issued immediately by the request
manager as soon as it handles them and the core has switched to another
pseudo-thread. In that case I never execute the code after the first
HTLoadAnchor(), because the response has already been received. The
remaining URLs in the list are therefore never processed and my robot
stops prematurely. This is why I think it is a real-time (timing) problem.

I see this problem on our intranet when the machines are not very busy,
and never on the Web.

I would like to know whether anyone has already faced this, has found a
workaround, or whether my analysis is wrong.

Thanks a lot.

Francois Nicot.




Received on Tuesday, 4 July 2000 08:40:31 UTC