RE: Downloading web pages using threads

How do you implement the multiple simultaneous socket connections to a
single host?
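
By hand I would expect it to look something like the sketch below:
several non-blocking connect()s to one host, multiplexed with a
select() loop. This is plain BSD sockets with a placeholder address
and connection count, not libwww; I would rather use whatever libwww
provides for this.

#include <string.h>
#include <fcntl.h>
#include <unistd.h>
#include <sys/select.h>
#include <sys/socket.h>
#include <netinet/in.h>
#include <arpa/inet.h>

#define NCONN 4			/* sockets to open to the one host */

int main (void)
{
    struct sockaddr_in addr;
    fd_set wfds;
    int socks[NCONN], maxfd = -1, i;

    memset(&addr, 0, sizeof(addr));
    addr.sin_family = AF_INET;
    addr.sin_port = htons(80);
    addr.sin_addr.s_addr = inet_addr("192.0.2.1");  /* placeholder IP */

    /* Start every connect at once; O_NONBLOCK means none of them
       waits, so the TCP handshakes proceed in parallel. */
    for (i = 0; i < NCONN; i++) {
	socks[i] = socket(AF_INET, SOCK_STREAM, 0);
	fcntl(socks[i], F_SETFL, O_NONBLOCK);
	connect(socks[i], (struct sockaddr *) &addr, sizeof(addr));
	/* returns -1 with errno == EINPROGRESS; that is expected */
	if (socks[i] > maxfd)
	    maxfd = socks[i];
    }

    /* A socket becomes writable once its connect has completed. */
    FD_ZERO(&wfds);
    for (i = 0; i < NCONN; i++)
	FD_SET(socks[i], &wfds);
    select(maxfd + 1, NULL, &wfds, NULL, NULL);

    /* ...send an HTTP request on each ready socket, then select()
       for readability and collect the responses... */

    for (i = 0; i < NCONN; i++)
	close(socks[i]);
    return 0;
}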

Thanks,
Clint

Clint B. Suson           408-986-8988 X-255
Network Software Engr.
GWCom Inc.

> -----Original Message-----
> From: www-lib-request@w3.org [mailto:www-lib-request@w3.org]On Behalf Of
> Sam Couter
> Sent: Tuesday, April 04, 2000 4:17 PM
> To: Phil Inglis
> Cc: www-lib@w3.org
> Subject: Re: Downloading web pages using threads
>
>
> Phil Inglis <inglisphil@home.com> wrote:
> > Hello, I am using libwww to download web pages. I have about 25
> > that I want to download at around the same time. Right now I
> > download one, and when it finishes I start the next. I have added
> > threading to speed this up, but it seems that only the first
> > thread goes through and all the others get terminated. It seems to
> > me this is because they are all using the same memory space. If it
> > were written using classes, I don't see how this would be a
> > problem. Are there any special commands I have to give to download
> > using threads? Below is the code I used in each thread.
> [ code snipped ]
>
> libwww is *NOT* thread-safe. Don't try to use it from more than one
> thread. I think you're lucky that even the first one was completed. :)
>
> One solution you can use is:
>
> HTProfile_newFooProfile();	/* e.g. HTProfile_newNoCacheClient() */
> HTNet_addAfter(terminate_handler, NULL, NULL, HT_ALL, HT_FILTER_LAST);
> while (there_are_more_files_to_get) {
> 	/* each call only queues a non-blocking request */
> 	request = HTRequest_new();
> 	status = HTLoadToFile(url, request, outputfile);
> }
> HTEventList_newLoop();	/* services all queued requests at once */
> HTProfile_delete();
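>
> (A sketch of the terminate_handler referenced above; the "remaining"
> counter is just an assumption here, set to however many requests you
> queue, so that the event loop stops after the last one finishes:)
>
> static int remaining = 25;	/* e.g. the 25 pages in question */
>
> int terminate_handler (HTRequest * request, HTResponse * response,
> 			void * param, int status)
> {
> 	HTRequest_delete(request);
> 	if (--remaining <= 0)
> 		HTEventList_stopLoop();	/* HTEventList_newLoop() returns */
> 	return HT_OK;
> }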
>
> This will go and get all the files you asked for, using what libwww calls
> "psuedothreads". It will also use funky features like pipelining and
> multiple simultaneous sockets (one for each host) to increase performance.
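>
> (If you need a knob for how many sockets libwww keeps open at once,
> I believe there is a global cap, something like the line below;
> check HTNet.h for the exact name and default:)
>
> HTNet_setMaxSocket(8);	/* allow up to 8 simultaneous sockets */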
> --
> Sam Couter                                              sam@topic.com.au
> Internet Engineer                               http://www.topic.com.au/
> tSA Consulting
>

Received on Tuesday, 4 April 2000 20:17:10 UTC