Re: Core dump when getting multiple/large files

From: Henrik Frystyk Nielsen <frystyk@w3.org>
Date: Mon, 27 Jul 1998 20:24:15 -0400
Message-Id: <>
To: Kimberly Doring <kimberly@biotools.com>, www-lib@w3.org
At 12:15 7/27/98 -0700, Kimberly Doring wrote:

>I've encountered a strange segmentation fault when trying to FTP
>multiple files using the HTLoadToFile function.  It appears to occur
>immediately after loading a "large" file (i.e. in this particular case,
>a 16MB file).  The 16MB file loads successfully, but if I try to load
>another file immediately after this one, I get a segmentation fault.  

Several people have been submitting patches to the FTP module lately -
would anyone care to help track down this problem?


Henrik Frystyk Nielsen,
World Wide Web Consortium
Received on Monday, 27 July 1998 20:24:00 UTC