Downloading web pages using threads

Hello, I am using libwww to download web pages. I have about 25 that I want to download at around the same time. Right now I download one, and when it finishes I start the next. I have added threading to speed this up, but it seems that only the first thread goes through and all the others get terminated. It seems to me this is because they are all using the same memory space; if it were written using classes I don't see how this would be a problem. Are there any special commands I have to give to download using threads? Below is the code I used in each thread; a rough sketch of how the threads might be spawned follows it.

int GetWebFile(CString url, CString outputFile, int num)
{
    int          status = 0; 
    HTRequest *         request = NULL;
 
    /* Initiate W3C Reference Library with a client profile */
    HTProfile_newNoCacheClient(APP_NAME, APP_VERSION);
 
    /* Need our own trace and print functions */
    HTPrint_setCallback(printer);
    HTTrace_setCallback(tracer);
 
#if 0
   HTSetTraceMessageMask("sop");
#endif
 
    /* Add our own filter to terminate the application */
    HTNet_addAfter(terminate_handler, NULL, NULL, HT_ALL, HT_FILTER_LAST);
 
    /* Setup cookies */
    HTCookie_init();

    HTCookie_setCallbacks(setCookie, NULL, findCookie, NULL);

    /* Set the timeout for how long we are going to wait for a response */
    HTHost_setEventTimeout(5000);
 
    request = HTRequest_new();

    status = HTLoadToFile(url, request, outputFile);

    /* Go into the event loop... */
    HTEventList_loop(request);
  
    /* Delete our profile if no load */
    HTProfile_delete();
  
    return status;
}
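
For illustration, here is a minimal sketch of the per-page thread setup described above. It assumes the GetWebFile() function from this message and Win32's _beginthreadex(); the DownloadJob structure, the DownloadThread() and DownloadAll() helpers, and the URL/output-file arrays are hypothetical names used only for this example, not part of the original code.

#include <windows.h>
#include <process.h>    /* _beginthreadex */

/* Hypothetical per-download job handed to each thread; CString is assumed
   to come from MFC, as in GetWebFile() above. */
struct DownloadJob
{
    CString url;
    CString outputFile;
    int     num;
};

/* Thread entry point: run one download and return its status */
static unsigned __stdcall DownloadThread(void * arg)
{
    DownloadJob * job = (DownloadJob *) arg;
    int status = GetWebFile(job->url, job->outputFile, job->num);
    delete job;
    return (unsigned) status;
}

/* Spawn one thread per page, then wait for all of them to finish */
void DownloadAll(const CString urls[], const CString files[], int count)
{
    HANDLE * handles = new HANDLE[count];

    for (int i = 0; i < count; i++)
    {
        DownloadJob * job = new DownloadJob;
        job->url        = urls[i];
        job->outputFile = files[i];
        job->num        = i;
        handles[i] = (HANDLE) _beginthreadex(NULL, 0, DownloadThread, job, 0, NULL);
    }

    /* 25 handles is well under the WaitForMultipleObjects limit of 64 */
    WaitForMultipleObjects(count, handles, TRUE, INFINITE);

    for (int i = 0; i < count; i++)
        CloseHandle(handles[i]);
    delete [] handles;
}

This only shows the thread plumbing the message describes; whether the libwww calls inside GetWebFile() can safely run in several threads at once is the open question above.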
