W3C home > Mailing lists > Public > www-lib@w3.org > April to June 2001

RE: multiple get requests to the same host

From: Jens Meggers <jens.meggers@firepad.com>
Date: Fri, 27 Apr 2001 19:11:51 -0700
Message-ID: <DDF913B74F07D411B22500B0D0206D9F4EF0B6@FIREPLUG>
To: "'Yoram Forscher'" <yoram@net2phone.com>, "'www-lib@w3.org'" <www-lib@w3.org>
Please find attached my version of HTHost.c. It uses the define
MAX_QUEUED_REQUESTS to set the maximum number of queued requests for a single
host. Set it to 1 if you do not want to pipeline. In case you are writing a
robot or something similar, you should know that this implementation includes
a sort of memory leak: if there are already more than MAX_QUEUED_REQUESTS in
the host pipe, it opens a new host object and allocates the memory for it.
The old host object is kept and is not deallocated until the same host is
requested again and HostTimeout = HOST_OBJECT_TTL expires. I actually had to
fix a bug in HTHost_new() for doing that. It might be of advantage to change
the loop in HTHost_new() to also deallocate hosts that do not match the
requested one. It would be necessary to change the while loop in HTHost_new()
to:
    /* Search the cache */
    HTList * cur = list;
    HTHost * found = NULL;
    while ((pres = (HTHost *) HTList_nextObject(cur))) {
        if (HTHost_isIdle(pres) && (time(NULL) > pres->ntime + HostTimeout) &&
            !pres->timer) {
            HTTRACE(CORE_TRACE, "Host info... Collecting host info %p\n" _ pres);
            /* delete the current one */
            delete_object(list, pres);
            /* reset to the start of the list (the only safe way to
               traverse the list while deleting from it) */
            cur = list;
        } else {
            if (!strcmp(pres->hostname, host) && u_port == pres->u_port) {
                int count = HTHost_numberOfOutstandingNetObjects(pres) +
                            HTHost_numberOfPendingNetObjects(pres);
                if (count < maxQueuedRequests) {
                    found = pres;
                    break;
                }
            }
        }
    }

This would at least clear all host objects that are in the same hash list as
the current one, so that the memory usage for host objects stays bounded.
Please let me know if it works for you,
-----Original Message-----
From: Yoram Forscher [mailto:yoram@net2phone.com]
Sent: Thursday, 26 April 2001 13:12
To: 'www-lib@w3.org'; 'jens.meggers@firepad.com'
Subject: Re: multiple get requests to the same host

I have the same experience, trying to send simultaneous GET requests to the
same host and getting the responses one at a time. When the requests are
sent to different hosts, they are really processed simultaneously. Is there
a way to get it to work in parallel? I understand from the previous message
that Jens Meggers has a version of HTHost.c that solves the problem. How can
I get this version? Is it becoming part of the "official" libwww?
Thanks -- YF

Received on Friday, 27 April 2001 22:23:24 UTC

This archive was generated by hypermail 2.3.1 : Tuesday, 6 January 2015 21:33:54 UTC