benchmark program for HTTP servers

[the original mail was lost during info.cern.ch downtime]
Hi,

I would like to profile my WWW server (or rather the CGI client),
and I'm looking around for a benchmark program/robot which would
automate the process. Basically, I'm looking for some software
which:

1. takes a base URL,
2. retrieves all URLs in the base document, but
3. does not go outside the server (e.g. restricts the set of
   allowed URLs),
4. enforces a minimum time between HEADs/GETs,
5. runs under Unix (preferably SunOS 4.1 - I have ported software
   to HP-UX/Solaris 2.x/DEC OSF/4.3BSD/AIX/Ultrix/SGI/Linux)
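For what it's worth, the robot described by points 1-4 can be sketched
in a few lines of (modern) Python; this is only an illustration of the
idea, not an existing tool, and the names `crawl`, `extract_links`, and
`same_server` are my own:

```python
import time
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse
from urllib.request import urlopen

class LinkExtractor(HTMLParser):
    """Collect href targets from <a> tags in a page."""
    def __init__(self):
        super().__init__()
        self.links = []
    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def same_server(base_url, url):
    """Point 3: restrict the allowed set to URLs on the base host."""
    return urlparse(url).netloc == urlparse(base_url).netloc

def extract_links(page_url, html):
    """Point 2: pull out every link, resolved against the page URL."""
    parser = LinkExtractor()
    parser.feed(html)
    return [urljoin(page_url, href) for href in parser.links]

def crawl(base_url, delay=1.0, limit=50):
    """Points 1 and 4: breadth-first GET of every on-server URL,
    with a minimum delay between requests; returns (url, status)."""
    seen, queue, results = {base_url}, [base_url], []
    while queue and len(results) < limit:
        url = queue.pop(0)
        with urlopen(url) as resp:          # one GET per URL
            results.append((url, resp.status))
            body = resp.read().decode(errors="replace")
        for link in extract_links(url, body):
            if same_server(base_url, link) and link not in seen:
                seen.add(link)
                queue.append(link)
        time.sleep(delay)                   # minimum time between GETs
    return results
```

Timing each `urlopen` call (and issuing HEAD instead of GET where the
body is not needed) would turn this from a crawler into the benchmark
proper.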

With all the HTTP servers around, I hope that some server author(s)
might have some (almost) ready-to-run software. I would write
something quick and dirty, but I currently have some other priorities
(thesis writing). Of course, I will summarize to the list.

thank you,

msj
--
Martin Sjölin | http://www.ida.liu.se/labs/iislab/people/marsj
Department of Computer Science, LiTH, S-581 83 Linköping, SWEDEN
phone : +46 13 28 24 10 | fax : +46 13 28 26 66 | e-mail: marsj@ida.liu.se 

Received on Wednesday, 1 March 1995 08:13:17 UTC