- From: Sven Laaks <slaaks@informatik.uni-essen.de>
- Date: Thu, 07 Jun 2001 18:09:00 +0200
- To: www-lib@w3.org
Hi everybody,

I modified the example robot so that it checks a web page at given intervals (e.g. every 10 minutes).

My first problem is that the robot terminates after exactly 12 hours with "segmentation fault (core dumped)". On other pages it runs for 24 hours and more. Could it be that a firewall kills the robot, or is this a bug?

The second problem is that memory use grows with every interval. I cannot find out where this happens, because I reuse the same HyperDoc and HText objects after the first interval. Does anyone know a solution?

Thanks,
Sven Laaks
Received on Thursday, 7 June 2001 12:16:19 UTC