
Re: Core dump when getting multiple/large files

From: Henrik Frystyk Nielsen <frystyk@w3.org>
Date: Wed, 29 Jul 1998 20:11:05 -0400
Message-Id: <3.0.5.32.19980729201105.009f68f0@localhost>
To: Kimberly Doring <kimberly@biotools.com>, www-lib@w3.org
At 12:15 7/27/98 -0700, Kimberly Doring wrote:

>I've encountered a strange segmentation fault when trying to FTP
>multiple files using the HTLoadToFile function.  It appears to occur
>immediately after loading a "large" file (i.e. in this particular case,
>a 16MB file).  The 16MB file loads successfully, but if I try to load
>another file immediately after this one, I get a segmentation fault.  
>
>In my code, I try to get 4 different files:
>
>pir1upd.5703
>pir2upd.5702
>pir3upd.5703
>pir4upd.5703
>
>I can successfully retrieve pir1upd.5703 and pir2upd.5703 (16MB file),
>but receive a segmentation fault when retrieving pir3upd.5703.

I had a go at your code and this is what I see: I do not get a segfault on
my system (Solaris 2.6). Even when I ran it under Purify, I didn't see any
problem whatsoever.

What is interesting is that I get "File not found" on pir3upd.5703.

It may be that the particular way the binary is packaged on your system
makes a difference. It could also be something that has been fixed in a
later patch. Are you using the version checked out from CVS? That is what I
just tried.
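For anyone else hitting this in the archive, a minimal sequential-fetch loop of the kind described above might look like the sketch below. It is only an illustration, not Kimberly's actual code: it assumes libwww's blocking (preemptive) client profile, the FTP host and directory are hypothetical placeholders, and only the four file names come from her report. One thing worth checking in a crashing program is that a fresh HTRequest is created and deleted for each transfer rather than reused. This needs libwww headers and libraries to build, so it is not runnable stand-alone.

```c
#include <stdio.h>
#include "WWWLib.h"    /* libwww core */
#include "WWWInit.h"   /* HTProfile_* initialization */

int main (void)
{
    /* Only the file names are from the original report; the host and
       directory below are hypothetical placeholders. */
    const char * files[] = {
        "pir1upd.5703", "pir2upd.5702", "pir3upd.5703", "pir4upd.5703"
    };
    int i;

    /* Set up a blocking (preemptive) client profile */
    HTProfile_newPreemptiveClient("FetchTest", "1.0");

    for (i = 0; i < 4; i++) {
        char url[256];
        HTRequest * request = HTRequest_new();   /* fresh request per fetch */

        sprintf(url, "ftp://ftp.example.org/pub/%s", files[i]);
        if (HTLoadToFile(url, request, files[i]) != YES)
            fprintf(stderr, "Failed to fetch %s\n", files[i]);

        HTRequest_delete(request);               /* release before the next one */
    }

    HTLibTerminate();
    return 0;
}
```

The point of the sketch is the per-iteration HTRequest_new/HTRequest_delete pairing; reusing a request object, or failing to delete it between large transfers, is the sort of thing that can look fine on one platform and crash on another.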

Hope this helps!

Henrik
--
Henrik Frystyk Nielsen,
World Wide Web Consortium
http://www.w3.org/People/Frystyk
Received on Wednesday, 29 July 1998 20:10:52 GMT
