Robots & Persistent connections, pipelining

Hi,

I'd like to know what you think about the use of persistent connections
(HTTP/1.1) and pipelining by robots.
Persistent connections were introduced to lower the load on the network and
on servers (the cost of setting up TCP connections). But their use by a robot
conflicts with the requirement that a robot should not overload a server with
rapid-fire requests. For example, a robot should not ask a server for a page
more often than once every 30 seconds. Taken literally, that rule rules out
persistent connections altogether.
But if I want to get 10 pages from a server, wouldn't it
be more advantageous (for the server too) to fetch the 10 pages at once over
a persistent connection and then leave it in peace for 300 seconds (30 * 10),
rather than opening a TCP connection and getting one page every 30 seconds,
10 times over?

First, I'd like to know what the consensus is on this matter.
Second, I'd like to know what you think of the following proposal, which is
meant to let robots use persistent connections:

We still want to avoid overloading the server, so we keep the budget of
30 seconds per page. But we allow a limited number of pages (say, 5 or 10) to
be fetched together over a persistent connection, on the condition that the
server is then left alone for proportionally longer. If we get 5 pages at
once, we won't ask anything from the same server for the next 30 * 5 = 150
seconds. What do you think?
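
For concreteness, here is a minimal sketch of that policy in Python. The
host, the paths, the 30-second figure, and the batch limit of 5 are just
placeholders for illustration, not part of any standard or existing robot.

    # Sketch: fetch a small batch of pages over one persistent HTTP/1.1
    # connection, then rest for (pages fetched) * 30 seconds so the server's
    # average request rate is no higher than one request per 30 seconds.
    import http.client
    import time

    PER_PAGE_DELAY = 30   # assumed politeness budget, seconds per page
    BATCH_LIMIT = 5       # assumed max pages per persistent connection

    def fetch_batch(host, paths):
        batch = paths[:BATCH_LIMIT]
        conn = http.client.HTTPConnection(host)  # HTTP/1.1, keep-alive by default
        bodies = []
        try:
            for path in batch:
                conn.request("GET", path)
                resp = conn.getresponse()
                bodies.append(resp.read())  # read fully so the connection can be reused
        finally:
            conn.close()
        # Leave the server in peace for as long as the same pages would have
        # taken at one request per 30 seconds.
        time.sleep(len(batch) * PER_PAGE_DELAY)
        return bodies

So fetching 5 pages this way costs the server one TCP connection instead of
five, and is followed by 150 seconds of silence.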

Regards,
Mihai

Received on Tuesday, 22 August 2000 08:05:08 UTC