RE: multiple get requests to the same host

Hi,

I put the code change into the version that I have (not the whole file, just
the code you included in the message) and I set MAX_QUEUED_REQUESTS to 1 and
it worked. I could send multiple requests to the same host and see the
results coming back in parallel. Thanks for your help.

Yoram

-----Original Message-----
From: Jens Meggers [mailto:jens.meggers@firepad.com]
Sent: Friday, April 27, 2001 10:12 PM
To: Yoram Forscher; 'www-lib@w3.org'
Subject: RE: multiple get requests to the same host

Hi,
 
please find attached my version of HTHost.c. It uses the define
MAX_QUEUED_REQUESTS to set the maximum number of queued requests for a single
host. Set it to 1 if you do not want to pipeline. In case you are writing a
robot or something similar, you should know that this implementation includes
a sort of memory leak: if there are already more than MAX_QUEUED_REQUESTS in
the host pipe, it opens a new host object and allocates the memory for it.
The old host object is kept and is not deallocated until the same host is
requested again and HostTimeout = HOST_OBJECT_TTL expires. I actually had to
fix a bug in HTHost_new() for doing that. It might be of advantage to change
the loop in HTHost_new() to also deallocate when it is not the same host. It
would be necessary to change the while loop in HTHost_new() to:
 
 
    /* Search the cache */
    {
        HTList * cur = list;
        HTHost * found = NULL;
        while ((pres = (HTHost *) HTList_nextObject(cur))) {
            if (HTHost_isIdle(pres) &&
                (time(NULL) > pres->ntime + HostTimeout) && !pres->timer) {

                HTTRACE(CORE_TRACE, "Host info... Collecting host info %p\n" _ pres);

                /* Delete the current host object */
                delete_object(list, pres);

                /* Reset to the start of the list (the only safe way to
                   continue a list traversal after a deletion) */
                cur = list;

            } else {
                if (!strcmp(pres->hostname, host) && u_port == pres->u_port) {
                    int count = HTHost_numberOfOutstandingNetObjects(pres) +
                                HTHost_numberOfPendingNetObjects(pres);
                    if (count < maxQueuedRequests) {
                        found = pres;
                    }
                }
            }
        }
        pres = found;
    }
}

 
This would at least clear all host object that are in the same hash list as
the current one, so that the memory usage for host objects is at least
bounded.
 
Please let me know if it works for you,
 
 
Jens
-----Original Message-----
From: Yoram Forscher [mailto:yoram@net2phone.com]
Sent: Thursday, 26 April 2001 13:12
To: 'www-lib@w3.org'; 'jens.meggers@firepad.com'
Subject: Re: multiple get requests to the same host


I have the same experience, trying to send simultaneous GET requests to the
same host and getting the responses one at a time. When the requests are
sent to different hosts, they are really processed simultaneously. Is there
a way to get it to work in parallel? I understand from the previous message
that Jens Meggers has a version of HTHost.c that solves the problem. How can
I get this version? Is it becoming part of the "official" libwww?
 
Thanks -- YF
 
 

Received on Wednesday, 2 May 2001 08:40:18 UTC