
Re: Core dump when getting multiple/large files

From: Henrik Frystyk Nielsen <frystyk@w3.org>
Date: Wed, 29 Jul 1998 20:11:05 -0400
Message-Id: <>
To: Kimberly Doring <kimberly@biotools.com>, www-lib@w3.org
At 12:15 7/27/98 -0700, Kimberly Doring wrote:

>I've encountered a strange segmentation fault when trying to FTP
>multiple files using the HTLoadToFile function.  It appears to occur
>immediately after loading a "large" file (i.e. in this particular case,
>a 16MB file).  The 16MB file loads successfully, but if I try to load
>another file immediately after this one, I get a segmentation fault.  
>In my code, I try to get 4 different files:
>I can successfully retrieve pir1upd.5703 and pir2upd.5703 (16MB file),
>but receive a segmentation fault when retrieving pir3upd.5703.
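The pattern described above — sequential blocking retrievals with HTLoadToFile() — might look roughly like the sketch below. This is only an illustration based on libwww's preemptive-client profile; the host, paths, and local filenames are placeholders, not the ones from the original report.

```c
/* Hypothetical sketch: fetch several FTP files one after another with
 * libwww's HTLoadToFile() in blocking (preemptive) mode.  URLs and
 * local filenames are placeholders. */
#include <stdio.h>
#include "WWWLib.h"
#include "WWWInit.h"

int main (void)
{
    const char * urls[]  = { "ftp://ftp.example.org/pub/fileA",
                             "ftp://ftp.example.org/pub/fileB" };
    const char * files[] = { "fileA.local", "fileB.local" };
    int i;

    /* Blocking client profile: each HTLoadToFile() call completes
       before the next transfer starts. */
    HTProfile_newPreemptiveClient("TestApp", "1.0");

    for (i = 0; i < 2; i++) {
        HTRequest * request = HTRequest_new();
        HTRequest_setOutputFormat(request, WWW_SOURCE);
        if (HTLoadToFile(urls[i], request, files[i]) != YES)
            fprintf(stderr, "Failed to retrieve %s\n", urls[i]);
        HTRequest_delete(request);   /* fresh request per transfer */
    }

    HTProfile_delete();
    return 0;
}
```

Using a new HTRequest per transfer (rather than reusing one) is the usage shown in libwww's own examples; if a segfault only appears on the transfer *after* a large file, the per-request state between calls would be the first place to look.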

I had a go at your code, and this is what I see: I do not get a segfault on
my system (Solaris 2.6). Even when I ran Purify, I didn't see any problems.

What is interesting is that I get "File not found" on pir3upd.5703.

It may be that the particular way the binary is packaged on your system
makes a difference. It could also be something that has been fixed in a
later patch. Are you using the version checked out from CVS? That is what I
just tried.

Hope this helps!

Henrik Frystyk Nielsen,
World Wide Web Consortium
Received on Wednesday, 29 July 1998 20:10:52 UTC
