Re: Core dump when getting multiple/large files

At 12:15 7/27/98 -0700, Kimberly Doring wrote:

>I've encountered a strange segmentation fault when trying to FTP
>multiple files using the HTLoadToFile function.  It appears to occur
>immediately after loading a "large" file (i.e. in this particular case,
>a 16MB file).  The 16MB file loads successfully, but if I try to load
>another file immediately after this one, I get a segmentation fault.  
>
>In my code, I try to get 4 different files:
>
>pir1upd.5703
>pir2upd.5702
>pir3upd.5703
>pir4upd.5703
>
>I can successfully retrieve pir1upd.5703 and pir2upd.5703 (16MB file),
>but receive a segmentation fault when retrieving pir3upd.5703.

I had a go at your code, and here is what I see: I do not get a segfault on
my system (Solaris 2.6). Even when I ran it under Purify, I didn't see any
problems whatsoever.

What is interesting is that I get "File not found" on pir3upd.5703.

It may be that the particular way the binary is built and packaged on your
system makes a difference. It could also be something that was fixed in a
later patch. Are you using a version checked out from CVS? That is what I
just tried.
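In case it helps you isolate where things go wrong, here is a minimal sketch of how I would drive HTLoadToFile over the four files in sequence, checking each return value and freeing each request before starting the next load. This is an assumption-laden sketch, not your code: the host name, path, and application name are placeholders, and it assumes libwww's preemptive client profile.

```c
/* Minimal sketch: fetch several files in sequence with libwww's
 * HTLoadToFile. The host ("ftp.example.org"), path, and application
 * name are placeholders -- substitute your own. Assumes libwww with
 * the convenience profile API is installed. */
#include <stdio.h>
#include "WWWLib.h"     /* core libwww library */
#include "WWWInit.h"    /* HTProfile_* convenience profiles */

int main (void)
{
    const char * files[] = {
        "pir1upd.5703", "pir2upd.5702", "pir3upd.5703", "pir4upd.5703"
    };
    int i;

    /* A preemptive (blocking) client profile keeps the control flow
       simple: each HTLoadToFile call returns when the load is done */
    HTProfile_newPreemptiveClient("TestApp", "1.0");

    for (i = 0; i < 4; i++) {
        char url[256];
        HTRequest * request = HTRequest_new();

        sprintf(url, "ftp://ftp.example.org/pub/%s", files[i]);

        /* Check the result of each load before moving on -- a
           "File not found" here should not be silently ignored */
        if (HTLoadToFile(url, request, files[i]) != YES)
            fprintf(stderr, "Load failed for %s\n", files[i]);

        /* Delete the request before starting the next load so no
           stale state is carried over between files */
        HTRequest_delete(request);
    }

    HTProfile_delete();
    return 0;
}
```

If your code reuses one HTRequest object across loads, or skips the cleanup between them, that would be the first place I'd look.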

Hope this helps!

Henrik
--
Henrik Frystyk Nielsen,
World Wide Web Consortium
http://www.w3.org/People/Frystyk

Received on Wednesday, 29 July 1998 20:10:52 UTC