
Re: Segm.Fault and memory problems

From: Michel Philip <mphilip@infovista.com>
Date: Wed, 13 Jun 2001 06:45:52 -0400 (EDT)
Message-ID: <F520B214418AD4119F7000508BD90CC294B8CB@hq01sr02.infovista.com>
To: "'www-lib@w3.org'" <www-lib@w3.org>
Hi Sven, (sorry for the draft answer)
Hi everybody, 

> I modified the example robot, so that it checks out a webpage
> in given intervals (e.g. every 10 minutes).
I work on a program doing the same thing.
How many different web pages are you polling?
> My first problem is, that the robot terminates after exactly 12h
> with "segmentation fault (core dump)". On other pages it runs
> 24h and more.
> Could it be, that a firewall can kill the robot? Or is this a bug?
Is the firewall's name Rambo?
I would rather suspect a bug, or incorrect use or modification.
> The second problem is, that the memory-use grows with every 
> intervall.
When periodically polling and HTTP-parsing a web page,
the main memory leak I ran into was caused by Anchors.
Even if you call HTRequest_delete(), the Anchors are not deleted:
Anchors are globally shared between all requests.
Keeping the Anchors is the right choice, since it is what allows
multiple requests to run at the same time.

You can clear the Anchors by calling HTAnchor_deleteAll()
at any point where no request is running.

> I can not find out where it happens, because I use the same 
> HyperDoc- and HText-Object after the first intervall.

"Using the same HyperDoc"...
How are you doing this? 
What functions do you call?
I believe your HyperDoc tree must grow at each interval,
because links between anchors will be duplicated.

(see also my previous answer to Attila:
http://lists.w3.org/Archives/Public/www-lib/2001AprJun/0056.html)
Michel.
Received on Wednesday, 13 June 2001 07:39:44 GMT
