Re: Segm.Fault and memory problems

Hi Sven, (sorry for the draft answer)
Hi everybody, 

> I modified the example robot so that it checks a web page
> at given intervals (e.g. every 10 minutes).
I work on a program doing the same thing.
How many different web pages are you polling?
> My first problem is that the robot terminates after exactly 12h
> with "segmentation fault (core dump)". On other pages it runs for
> 24h or more.
> Could it be that a firewall can kill the robot? Or is this a bug?
Is the firewall's name Rambo?
I was rather thinking of a bug, or of incorrect use or modification of the code.
> The second problem is that the memory use grows with every
> interval.
When periodically polling and HTTP-parsing a web page,
the main memory leak I ran into was caused by Anchors.
Even if you call HTRequest_delete(), the Anchors are not deleted:
Anchors are globally shared between all requests, and keeping them
around is the right choice, since that is what allows several
requests to run at the same time.

You can clear the Anchors by calling HTAnchor_deleteAll()
at any point where no request is running, for example between
two polling intervals, as in the sketch below.
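
To make that concrete, here is a minimal sketch of such a polling loop.
It assumes a blocking (preemptive) client and a fetch through
HTLoadToChunk(); the URL, the 10-minute sleep and the missing error
handling are placeholders, not anything taken from your robot:

#include <unistd.h>
#include "WWWLib.h"
#include "WWWInit.h"

#define POLL_URL "http://www.example.org/"     /* placeholder URL */

int main (void)
{
    /* simple blocking (preemptive) client profile */
    HTProfile_newPreemptiveClient("PollingRobot", "1.0");

    for (;;) {
        HTRequest * request = HTRequest_new();
        HTChunk * chunk = HTLoadToChunk(POLL_URL, request);

        /* ... parse the page held in the chunk ... */

        if (chunk) HTChunk_delete(chunk);
        HTRequest_delete(request);    /* frees the request, NOT the anchors */

        /* No request is running here, so the globally shared anchor
           cache can be flushed.  HyperDoc objects attached to parent
           anchors are handed back in the list so that the application
           can free them itself. */
        {
            HTList * docs = HTList_new();
            HTAnchor_deleteAll(docs);
            /* free your own HyperDoc objects found in docs, if any */
            HTList_delete(docs);
        }

        sleep(10 * 60);               /* poll every 10 minutes */
    }

    HTProfile_delete();               /* never reached in this sketch */
    return 0;
}

With something like that, memory use should stay roughly flat from one
interval to the next; if it still grows, the leak is probably in the
parsing code itself.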

> I cannot find out where it happens, because I use the same
> HyperDoc and HText object after the first interval.

"Using the same HyperDoc"...
How are you doing this? 
What functions do you call?
I believe your HyperDoc tree must get bigger at each interval
because the links between anchors will be duplicated.
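
In case it helps, the way I attach and reuse one application object
per page (as far as I understand the Anchor API) is through
HTAnchor_document()/HTAnchor_setDocument(); the MyDoc struct and
get_or_create_doc() below are just hypothetical stand-ins for whatever
your HyperDoc really is:

#include <stdlib.h>
#include "WWWLib.h"

typedef struct _MyDoc {        /* hypothetical stand-in for your HyperDoc */
    int poll_count;
} MyDoc;

MyDoc * get_or_create_doc (const char * url)
{
    HTAnchor * anchor = HTAnchor_findAddress(url);
    HTParentAnchor * parent = HTAnchor_parent(anchor);
    MyDoc * doc = (MyDoc *) HTAnchor_document(parent);

    if (!doc) {                                /* first interval only */
        doc = (MyDoc *) calloc(1, sizeof(MyDoc));
        HTAnchor_setDocument(parent, doc);     /* attach it to the parent anchor */
    }
    doc->poll_count++;
    return doc;
}

Note that if you flush the anchors with HTAnchor_deleteAll() between
intervals, the object attached to the parent anchor is handed back to
you at that point, so you then have to free it and rebuild it on the
next interval.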

(See also my previous answer to Attila:
http://lists.w3.org/Archives/Public/www-lib/2001AprJun/0056.html)
Michel.

Received on Wednesday, 13 June 2001 07:39:44 UTC