Re: About Webbot and other.

fiorenti@cli.di.unipi.it writes:

>         I have some doubts about Webbot's behavior and about converters.
>         1. If I run the Webbot on the local file system (I don't know what
> happens over the network...), I get a Segmentation Fault. Perhaps this error
> occurs as a consequence of my system configuration (Linux 1.3.20, gcc
> 1.2.7)? I tried to track down the error using gdb, but..... I failed. Can you
> help me?

What version are you using? Have you tried to trace the error? Where does
it dump core?
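
If you haven't, a quick way to find that out (assuming the robot binary is
called webbot and your system leaves a core file behind) is something like:

	$ gdb webbot core
	(gdb) bt            <- backtrace: shows where it dumped core
	(gdb) frame 0       <- select the innermost frame
	(gdb) info locals   <- inspect the local variables there

Posting the resulting backtrace makes it much easier to tell whether the
problem is in the library or in the local setup.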
         
>         2. When in a request I set the output format to www/present, and
> the request refers to a document of content type text/plain, (if the library
> is well configured...) some converters are set up to fulfill the request. Now,
> when using the HText interface, why aren't HText_endAppend and
> HText_free called when the document is parsed? Is this a mistake, or is
> there a particular reason for this behavior? Can you help me?

You are right that this is missing. You can add the following line to the
HTPlain_free function in HTPlain.c:

	HText_endAppend(me->text);

However, it shouldn't be called in the flush method (even though that is done
in the HTML.c module, which is also a bug). The reason is that flush doesn't
have to finish the stream; only free and abort terminate the stream. Also, it
shouldn't free the text object. As Maciej writes, it is up to the application
to free the text object, as it may want to keep it around for some time as a
memory cache. Likewise, the HTWriter stream's flush method doesn't close the
socket - it just flushes any buffers.
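
For reference, here is roughly what the patched HTPlain_free could end up
looking like. Treat it as a sketch only - the exact struct fields and return
codes depend on which libwww version you have, and me->text is assumed to be
the HText object the stream has been building:

	PRIVATE int HTPlain_free (HTStream * me)
	{
	    if (me->text)
	        HText_endAppend(me->text);	/* terminate the text object */
	    free(me);				/* free the stream, NOT the HText */
	    return HT_OK;
	}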

>         3. Why doesn't the timeout handler return the right request object?
> When will this error be fixed? Can you help me?

The reason is that this is not necessary for an interactive user, who can hit
abort whenever he or she wants to. Netscape has exactly the same behavior.
However, in the case of a robot or a server this is not true, and you want
better control over which requests have timed out.
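
To illustrate what that better control would mean, a robot would want the
timeout handler to be given the request that actually timed out, so it can
deal with just that one. The sketch below is purely illustrative - the handler
signature and the call to HTRequest_kill are assumptions, not the current API:

	int Robot_timeoutHandler (HTRequest * request)
	{
	    if (request) {
	        HTRequest_kill(request);	/* abort only the timed-out request */
	        /* ...then log the URL, retry it later, or give up on it */
	    }
	    return 0;
	}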

There was a mail about this on the list some time ago. Has anybody had time to 
look into that?

-- 

Henrik Frystyk Nielsen, <frystyk@w3.org>
World-Wide Web Consortium, MIT/LCS NE43-356
545 Technology Square, Cambridge MA 02139, USA

Received on Wednesday, 7 February 1996 11:31:50 UTC