W3C home > Mailing lists > Public > www-lib@w3.org > July to September 1998

Core dump when getting multiple/large files

From: Kimberly Doring <kimberly@biotools.com>
Date: Mon, 27 Jul 1998 12:15:35 -0700
Message-ID: <35BCD1D7.BAED0AEF@biotools.com>
To: www-lib@w3.org

Hi!

I've encountered a strange segmentation fault when trying to FTP
multiple files using the HTLoadToFile function.  It appears to occur
immediately after loading a "large" file (i.e. in this particular case,
a 16MB file).  The 16MB file loads successfully, but if I try to load
another file immediately after this one, I get a segmentation fault.  

In my code, I try to get 4 different files:

pir1upd.5703
pir2upd.5703
pir3upd.5703
pir4upd.5703

I can successfully retrieve pir1upd.5703 and pir2upd.5703 (the 16MB file),
but receive a segmentation fault when retrieving pir3upd.5703.


Here is my code:

#include "WWWLib.h"
#include "WWWMIME.h"
#include "WWWNews.h"
#include "WWWHTTP.h"
#include "WWWFTP.h"
#include "WWWFile.h"
#include "WWWGophe.h"
#include "WWWInit.h"

#define FILENAME_SIZE  128
#define STDERR_FILE    "/home/internal/log/FTPstderr"
#define SAVE_PATH      "/home/internal/dbupdates"
#define PIR            "ftp://nbrf.georgetown.edu/pir/updates"


int main (int argc, char ** argv)
{
    HTRequest * request;
    int i = 1, status = 0;
    char pirbuff[FILENAME_SIZE];
    char pirsavebuff[FILENAME_SIZE];

    memset(pirbuff, 0, FILENAME_SIZE);
    memset(pirsavebuff, 0, FILENAME_SIZE);

    /* freopen(STDERR_FILE, "w+", stderr); */

    HTProfile_newPreemptiveClient("TestApp", "1.0");

    for (i = 1; i < 5; i++) {
        sprintf(pirbuff, "%s/pir%dupd.5703", PIR, i);
        sprintf(pirsavebuff, "%s/pir%dupd.5703", SAVE_PATH, i);

        /* now get the file */
        fprintf(stderr, "Getting file %s\nSave to %s\n", pirbuff, pirsavebuff);
        request = HTRequest_new();
        HTLoadToFile(pirbuff, request, pirsavebuff);
        HTRequest_delete(request);
    }

    HTProfile_delete();
    return 0;
}


My code was compiled with the following Makefile rules (using GNU make):

FTPtest.o: FTPtest.c
        gcc -DHAVE_CONFIG_H -I. -I/usr/local/include/w3c-libwww -g -O2 -c FTPtest.c

FTPtest: FTPtest.o /usr/local/lib/libwww.a
        gcc -g -O2 -o FTPtest FTPtest.o /usr/local/lib/libwww.a -lm -ldl



Here is a trace from gdb:

Getting file ftp://nbrf.georgetown.edu/pir/updates/pir3upd.5703
Save to /home/internal/dbupdates/pir3upd.5703

Program received signal SIGSEGV, Segmentation fault.
0x8071659 in SendCommand (request=0x8095388, ctrl=0x809d768, token=0x8081073 "REIN", pars=0x0)
    at HTFTP.c:386
HTFTP.c:386: No such file or directory.
(gdb) where
#0  0x8071659 in SendCommand (request=0x8095388, ctrl=0x809d768, token=0x8081073 "REIN", pars=0x0)
    at HTFTP.c:386
#1  0x8071d4d in HTFTPLogin (request=0x8095388, cnet=0x80a59d0, ctrl=0x809d768) at HTFTP.c:708
#2  0x8073266 in FTPEvent (soc=-1, pVoid=0x809d768, type=HTEvent_BEGIN) at HTFTP.c:1520
#3  0x8072ea3 in HTLoadFTP (soc=-1, request=0x8095388) at HTFTP.c:1379
#4  0x8054944 in HTNet_newClient (request=0x8095388) at HTNet.c:732
#5  0x804caf4 in HTLoad (me=0x8095388, recursive=0 '\000') at HTReqMan.c:1575
#6  0x804984f in launch_request (request=0x8095388, recursive=0 '\000') at HTAccess.c:75
#7  0x8049883 in HTLoadAbsolute (url=0xbffffd38 "ftp://nbrf.georgetown.edu/pir/updates/pir3upd.5703",
    request=0x8095388) at HTAccess.c:88
#8  0x80499c3 in HTLoadToFile (url=0xbffffd38 "ftp://nbrf.georgetown.edu/pir/updates/pir3upd.5703",
    request=0x8095388, filename=0xbffffcb8 "/home/internal/dbupdates/pir3upd.5703") at HTAccess.c:164
#9  0x80497e4 in main (argc=1, argv=0xbffffdd4) at FTPtest.c:27



Here is a trace from libwww (starting just after pir2upd.5703
successfully loaded):

Load End.... OK: `ftp://nbrf.georgetown.edu/pir/updates/pir2upd.5703'
Memory Free. 0x80a5a20
Request..... Delete 0x8089ee8
Response.... Delete 0x80a5b80
Memory Free. 0x80a5b80
Memory Free. 0x8089ee8
Getting file ftp://nbrf.georgetown.edu/pir/updates/pir3upd.5703
Save to /home/internal/dbupdates/pir3upd.5703
Request..... Created 0x8089ee8
Memory Free. 0x809d988
Memory Free. 0x80a5a20
Memory Free. 0x809d620
Memory Free. 0x809d660
HTSimplify.. `ftp://nbrf.georgetown.edu/pir/updates/pir3upd.5703' into
............ `ftp://nbrf.georgetown.edu/pir/updates/pir3upd.5703'
Find Parent. 0x808b4e0 with hash 555 and address `ftp://nbrf.georgetown.edu/pir/updates/pir3upd.5703' created
HTAccess.... Accessing document ftp://nbrf.georgetown.edu/pir/updates/pir3upd.5703
Memory Free. 0x808b570
Net Before.. calling 0x806e658 (request 0x8089ee8, context (nil))
Check rules. for `ftp://nbrf.georgetown.edu/pir/updates/pir3upd.5703'
Memory Free. 0x808b570
Net Before.. calling 0x806e5b0 (request 0x8089ee8, context (nil))
Memory Free. 0x8095358
Memory Free. 0x808b570
Memory Free. 0x809d780
Memory Free. 0x80a5a58
Net Object.. 0x809d780 created with hash 4
Net Object.. starting request 0x8089ee8 (retry=1) with net object 0x809d780
FTP......... Looking for `ftp://nbrf.georgetown.edu/pir/upd



Please note that I can also change the order of these files - i.e. I can
try to get pir2upd.5703 first, second, or third, but whichever file I try
to retrieve immediately after it results in a core dump.  If I make
pir2upd.5703 the last file to load, the program completes successfully.
The same problem has occurred with other files I have tried to load.

Sorry for such a long explanation, but any help/comments would be
greatly appreciated!

Kimberly
Received on Monday, 27 July 1998 14:16:06 GMT
