W3C home > Mailing lists > Public > www-talk@w3.org > July to August 2000

Re: Robots & Persistent connexion, pipelining

From: Nic Ferrier <nferrier@tapsellferrier.co.uk>
Date: Tue, 22 Aug 2000 13:37:52 +0100
Message-Id: <s9a28229.030@tapsellferrier.co.uk>
To: Mihai.Preda@inria.fr, www-server@w3.org, www-talk@w3.org
>>> Mihai Preda <Mihai.Preda@inria.fr> 22-Aug-00 1:05:14 PM >>>

>But, if I want to get 10 pages from a server, wouldn't it
>be more advantageous (for the server too) to get the 10 pages
>at once (with a persistent connection) and afterwards leave it in
>peace for 300 sec (30*10), rather than open a TCP connection and
>get a page every 30 sec, 10 times?

Well... there is still the question of processor and disk I/O utilisation.

>First, I'd like to know what is the consensus in this matter.

I think persistent connections are all right for robots.

Trouble is, you can't guarantee you're going to get one, can you?
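That check can be done from the response itself. A minimal sketch (the function name and its two parameters are my own, not anything from the thread): HTTP/1.1 keeps the connection open by default unless the server says `Connection: close`, while HTTP/1.0 only keeps it open if the server explicitly says `Connection: keep-alive`, so a robot has to inspect both the protocol version and the header before reusing the socket.

```python
def connection_persists(http_version, connection_header):
    """Decide whether the server has left the connection open for reuse.

    http_version: 10 or 11 (the integer form Python's http.client reports).
    connection_header: value of the response's Connection header, or None.
    """
    token = (connection_header or "").lower()
    if http_version >= 11:
        # HTTP/1.1: persistent by default, unless the server opts out.
        return "close" not in token
    # HTTP/1.0: persistent only if the server explicitly granted it.
    return "keep-alive" in token
```

A robot that gets `False` back should fall back to one-request-per-connection crawling for that server.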

>Second, I'd like to know what you think about this proposition,
>which is aimed at allowing the use of persistent connections by robots:
>So, we are against server overload, and we allocate 30 sec/page.
>But we allow a robot to get a limited number of pages (say, 5 or 10)
>through a persistent connection, on the condition that it leaves the
>server alone for longer afterwards. If we get 5 pages at once, we won't
>ask anything from the same server for the next 30*5 seconds. What do
>you think?

I don't think that's necessary myself... if you get a persistent
connection you should be able to get 5-10 pages and then drop it for
30 seconds.

Still... it may be worth doing... it would take a long time for rule
changes to percolate through to the real net (as it were).

Received on Tuesday, 22 August 2000 08:32:25 UTC
