
RE: huge pages

From: Yovav Meydad <yovavm@contact.com>
Date: Wed, 17 May 2000 18:25:16 +0200
To: "'Ashraf Mohamed'" <ashraf.mohamed@spanlink.com>, <www-lib@w3.org>
Message-ID: <002301bfc01c$7cf791c0$1c44a8c0@yovavm>
You can implement a progress filter in which you calculate the amount of data
received so far; at that point you can perform the interruption. See the
default progress handler as a reference - HTDialog_progressMessage in
HTDialog.c.
Do not forget to add your handler when initializing the library, using:
HTAlert_add(MyProgressFilter, HT_A_PROGRESS);
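
Something like this (untested) could serve as a starting point. MyProgressFilter
and MAX_BYTES are just example names, and I am assuming HTRequest_bodyRead()
for the byte count (the same kind of call HTDialog_progressMessage uses) and
HTRequest_kill() for aborting; check them against your version of the library:

#include "WWWLib.h"

#define MAX_BYTES (512 * 1024L)    /* example cut-off, pick your own */

BOOL MyProgressFilter (HTRequest * request, HTAlertOpcode op,
                       int msgnum, const char * dfault,
                       void * input, HTAlertPar * reply)
{
    if (op == HT_PROG_READ && request) {
        /* assumed call: how many body bytes have arrived so far */
        long bytes = HTRequest_bodyRead(request);
        if (bytes > MAX_BYTES)
            HTRequest_kill(request);    /* abort this request */
    }
    return YES;                         /* let other handlers run as usual */
}

If you would rather cut off on elapsed time than on bytes, you can record a
timestamp before issuing the request and compare against it in the same
callback.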

Yovav

-----Original Message-----
From: www-lib-request@w3.org [mailto:www-lib-request@w3.org]On Behalf Of
Ashraf Mohamed
Sent: Wednesday, May 17, 2000 5:11 PM
To: www-lib@w3.org
Subject: huge pages


hi

I am trying to access a page that has thousands of lines of text/html, and it
is taking so long to retrieve the data that it hogs my box.

I need to interrupt/stop the retrieval after a certain amount of time. Could
anyone suggest a simple way of doing this? Are there any libwww functions
that could take care of this?

I was thinking of setting up an alarm before I make my GET/POST request.
Roughly what I had in mind (a generic POSIX sketch, not libwww-specific;
on_alarm and arm_timeout are just names I made up):
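
#include <signal.h>
#include <unistd.h>

static volatile sig_atomic_t timed_out = 0;

static void on_alarm (int sig)
{
    (void) sig;
    timed_out = 1;                  /* flag the application checks later */
}

static void arm_timeout (unsigned seconds)
{
    struct sigaction sa;
    sa.sa_handler = on_alarm;
    sigemptyset(&sa.sa_mask);
    sa.sa_flags = 0;                /* no SA_RESTART: let blocking calls return */
    sigaction(SIGALRM, &sa, NULL);
    alarm(seconds);                 /* e.g. arm_timeout(30) before the GET */
}

but I am not sure how well that plays with the library internals.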

btw I am on Unixware-7.

Thanx
Ash
Received on Wednesday, 17 May 2000 11:28:37 GMT
