Re: Handling Errors and Interruptions

From: Henrik Frystyk Nielsen <frystyk@w3.org>
Date: Thu, 20 May 1999 14:58:07 -0400
To: www-lib@w3.org
Message-ID: <37445B3F.DA53A258@w3.org>
To: Michael Saunders <michael@amtec.com>
Michael Saunders wrote:


> I am encountering three problems that may be related:
> 
>         1) While a request is in the process of downloading or uploading
>            a file how does one stop the transfer? I have attempted all
>            variations of trying to kill the request from the progress
>            alert dialog and from a timer with no effect, the transfer
>            just keeps on going. How do you cleanly stop a request that
>            is in progress?

There are two ways to end a request prematurely: let it time out, or
kill it explicitly. Timeouts can be set using the various timers
described at

	http://www.w3.org/Library/User/Using/Timers.html

Alternatively, you can call the kill method on the request:

	http://www.w3.org/Library/src/HTReq.html#Killing
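
A minimal sketch of both approaches in C, using the libwww calls
referenced above (HTRequest_kill, HTHost_setEventTimeout); the function
names wrapping them and the 20-second value are just illustrative:

```c
#include "WWWLib.h"   /* libwww umbrella header */

/* Abort a transfer that is still in flight.  `request` is the
   HTRequest handle you got when the load was started; this can be
   called from a timer or a progress callback. */
static void abort_transfer (HTRequest * request)
{
    if (request) HTRequest_kill(request);
}

/* Or bound every request with a global event timeout (milliseconds);
   a request that stalls longer than this is terminated for you by the
   event loop. */
static void bound_all_requests (void)
{
    HTHost_setEventTimeout(20000);   /* give up after 20 seconds */
}
```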

>         2) This utility I created is only a test for a larger project.
>            What I want to do is have my main application's X event loop
>            separate from the libwww event loop (mostly because I don't
>            understand how to integrate the two yet). Therefore after a
>            download request completion I want to exit the loop and
>            return to my main application's X event loop. To do this I
>            put a "if (HTNet_isIdle()) HTEventList_stopLoop();" construct
>            in the HT_PROG_DONE section of the progress alert dialog.
>            This works fine for a successful download but not for a
>            failed one (let's say a bad host name is specified). I tried
>            putting the same construct after printing the "bad host
>            error message" but it indicates that it is not idle, i.e.
>            requests are still pending.

When the request is done (regardless of the status code), the Net
object should be deleted in HTNet_delete() and the net object counter
decreased. Try running the program with protocol and core traces and
you should be able to see what is going on; see

	http://www.w3.org/Library/src/HTHome.html#Trace

for how to turn on these traces.
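
For the "stop the loop on failure too" part, one common libwww pattern
is a global after filter registered with HTNet_addAfter: it runs when a
request terminates with any status, success or failure, so it is a
safer place for the stop-loop check than the progress dialog. A sketch,
with the trace call included; the filter name is illustrative, and the
trace-mask letters ("pc" for protocol and core) are an assumption to be
checked against the HTHome.html#Trace page:

```c
#include "WWWLib.h"

/* After filter: called when any request terminates, whatever the
   status code.  Leave the libwww event loop once no requests remain. */
static int terminate_handler (HTRequest * request, HTResponse * response,
                              void * param, int status)
{
    if (HTNet_isIdle()) HTEventList_stopLoop();
    return HT_OK;
}

static void setup (void)
{
    /* HT_ALL: match every status; HT_FILTER_LAST: run after the
       built-in filters have cleaned up the Net object. */
    HTNet_addAfter(terminate_handler, NULL, NULL, HT_ALL, HT_FILTER_LAST);

    /* Turn on trace output before issuing requests; the flag string
       here is a guess - see HTHome.html#Trace for the real letters. */
    HTSetTraceMessageMask("pc");
}
```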

>         3) Certain poorly formed URLs will cause the select() to block.
>            Even the 'LoadToFile' application will hang on the select()
>            call (because the last argument to select(), a pointer to
>            timeval, is mysteriously NULL for certain cases of poorly
>            formed URLs). You can duplicate the problem by giving the
>            LoadToFile example a URL to a known ftp server but to a
>            file that does not exist. How do you break out of these
>            errors? You can also cause 'LoadToFile' to hang if
>            you enter a bad target file by incorrectly specifying
>            a "file:" prefix:
>                 LoadToFile ftp://myHost/myFile.txt -o file:/tmp/myFile.txt

I think there has been a bit of bit-rot in the FTP module because of the
changes in the way we handle host and net objects. That is why it
doesn't handle accepting connections for the moment.
 
> Can anyone point me in the correct direction? I have been pulling my hair
> out trying to solve these problems. I like the architecture of the library and would
> like to use it but I have to solve these problems to make it acceptable for the
> end user.

-- 
Henrik Frystyk Nielsen, <frystyk@w3.org>
World Wide Web Consortium, MIT/LCS NE43-356
545 Technology Square, Cambridge MA 02139, USA
