Re: What am I doing wrong here?
Matthew,
> Following is a tiny wrapper program that I wrote to use libwww to simply
> retrieve files and send them to stdout. I find that it works with the
> first http URL I put in, but if I put in a second one, it core dumps. Here
> is the program...
The problem is that the HTFWriter stream is freed when you have downloaded the
first document, so the next request dumps core. The general rule for the
HTRequest structure is to use it only once, that is, to create a new one per
request. That way you can be sure the structure is initialized the way you
want, and you more easily avoid situations like this :-) In the Line Mode
Browser, I keep a list of the request structures currently in progress and
always create a new structure for every new request.
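Something along these lines should work. Note that HTRequest_delete() below is
only my guess at the cleanup call - substitute whatever your version of the
library provides for freeing a finished request:

    #include "WWWLib.h"
    #include "stubs.h"   /* HText_* function stubs, as in your program */

    #define MAX_URL_LEN 1024

    int main(int argc, char *argv[])
    {
        char url[MAX_URL_LEN];

        /* Initialize the library */
        HTLibInit();
        WWW_TraceFlag = TRUE;

        /* Loop until close of stdin, reading in URLs */
        while (gets(url)) {
            /* A fresh request and a fresh output stream for every URL,
               since the stream is freed when the document has loaded */
            HTRequest *request = HTRequest_new();
            request->output_format = WWW_SOURCE;
            request->output_stream = HTFWriter_new(stdout, TRUE);

            HTLoadAbsolute(url, request);

            /* Assumed cleanup call - free the finished request */
            HTRequest_delete(request);
        }

        HTLibTerminate();
        return 0;
    }

The point is that both the request structure and the output stream are new for
every URL, so nothing is left pointing at a stream that has already been freed.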
--
Henrik Frystyk frystyk@W3.org
World-Wide Web Consortium, Tel + 1 617 258 8143
MIT/LCS, NE43-356 Fax + 1 617 258 8682
77 Massachusetts Avenue
Cambridge MA 02154, USA
>
> #include "WWWLib.h"
> #include "stubs.h" /* Just a bunch of HText_* function stubs */
> #define MAX_URL_LEN 1024
>
> main(int argc, char *argv[])
> {
>     char url[MAX_URL_LEN];
>     HTRequest *request;
>
>     /* Initialize the library */
>     HTLibInit();
>     WWW_TraceFlag = TRUE;
>
>     /* Set up the static part of our request structure */
>     request = HTRequest_new();
>     request->output_format = WWW_SOURCE;
>     request->output_stream = HTFWriter_new(stdout, TRUE);
>
>     /* Loop until close of stdin, reading in URLs */
>     while (gets(url))
>         HTLoadAbsolute(url, request);
>
>     HTLibTerminate();
> }
>
> Is there anything obviously wrong with it?