
Re: Core dump when getting multiple/large files

From: Henrik Frystyk Nielsen <frystyk@w3.org>
Date: Mon, 27 Jul 1998 20:24:15 -0400
Message-Id: <3.0.5.32.19980727202415.00ae2ae0@localhost>
To: Kimberly Doring <kimberly@biotools.com>, www-lib@w3.org
At 12:15 7/27/98 -0700, Kimberly Doring wrote:

>I've encountered a strange segmentation fault when trying to FTP
>multiple files using the HTLoadToFile function.  It appears to occur
>immediately after loading a "large" file (i.e. in this particular case,
>a 16MB file).  The 16MB file loads successfully, but if I try to load
>another file immediately after this one, I get a segmentation fault.  
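
[Editorial note: the following is a minimal sketch of the call pattern described
above, fetching two files back to back with HTLoadToFile under libwww's
preemptive (blocking) client profile. It is not code from the original report;
the URLs, local file names, and application name are placeholders, and the
profile/request calls follow the libwww client API as best understood.]

#include <stdio.h>
#include "WWWLib.h"
#include "WWWInit.h"

int main (void)
{
    /* Placeholder URLs and output files -- not from the original report. */
    const char * urls[]  = { "ftp://ftp.example.com/pub/large-16mb.dat",
                             "ftp://ftp.example.com/pub/next-file.dat" };
    const char * files[] = { "large-16mb.dat", "next-file.dat" };
    int i;

    /* Set up a blocking (preemptive) client profile. */
    HTProfile_newPreemptiveClient("TestApp", "1.0");

    for (i = 0; i < 2; i++) {
        HTRequest * request = HTRequest_new();

        /* Load the i-th URL straight into a local file. The crash
         * reported above occurs on the request that follows the
         * ~16MB transfer. */
        if (HTLoadToFile(urls[i], request, files[i]) != YES)
            printf("Load of %s failed\n", urls[i]);

        HTRequest_delete(request);
    }

    /* Clean up the profile and terminate the library. */
    HTProfile_delete();
    return 0;
}
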

Several people have been submitting patches to the FTP module lately -
would anyone care to help track down this problem?

Thanks!

Henrik
--
Henrik Frystyk Nielsen,
World Wide Web Consortium
http://www.w3.org/People/Frystyk
