
Re: HTTP Session Extension draft

From: Daniel DuBois <ddubois@spyglass.com>
Date: Mon, 10 Jul 95 17:13:38 -0500
Message-Id: <9507102213.AA11655@hook.spyglass.com>
To: http-wg%cuckoo.hpl.hp.com@hplb.hpl.hp.com
>If retrieval is truly a stateless process, then the probability of an
>object being requested will be independent of any previous object
>requests. Any perusal of activity logs will show that this is clearly not
>the case: if a page with links is requested, any links on that page will
>have a higher probability of being requested than if the page that links
>to them had not been fetched.

While this may be true, any attempt on the part of the server to *guess*
what a client may do next would require a lot of processing and would open a
huge can of worms I know I'd rather avoid.  Servers at popular sites today
have enough trouble keeping up with simple stateless requests.  I'd hate to
see the effect of making them remember user information about everyone who
comes in the door.

It's clear that we need persistent connections because of inline images.
But that doesn't mean we need statefulness.

Now that I've 'shared' about statefulness, I may as well speak my mind on
persistent connections.  I'm not yet convinced persistent connections are a
good thing for anything other than a client that knows for certain it will
be requesting multiple documents.

CASE 1 -- Have grabbed "initial.html".  Doc has 5 inline images.  User has
indicated inline images are to be grabbed.  My opinion: making the next
request over a persistent connection is definitely a 'good thing'.
CASE 2 -- Want to grab "initial.html".  Client has no clue yet whether the
document contains inline images.  My opinion: I question whether making a
persistent connection here is valid, especially if the user has requested
that inline images not be sent.
CASE 3 -- Want to grab "initial.html".  User doesn't load inline images, but
the client guesses that the user will be so enamored with this document that
they might want to click through to another document on the same server.  My
opinion: I think the client is out of line holding a connection open on a
server solely for this reason.  Doesn't an extra connection mean an extra
process, and an extra copy of the server program in memory, on many servers?
What a waste!  Now people who are making legitimate connections with real
requests get slower performance while the server swaps memory to disk.
Maybe next, rude clients will hold connections open on documents 3 layers
deep in their history, just on the off chance the user might go back and
start clicking again.
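[Editor's note: as a rough illustration of Case 1 in modern terms, the sketch
below fetches a page and then an "inline image" over one reused TCP
connection.  It uses Python's standard library, which obviously postdates
this discussion; the toy server, paths, and handler are hypothetical and not
from any draft under discussion.]

```python
import threading
from http.server import BaseHTTPRequestHandler, ThreadingHTTPServer
import http.client

class Handler(BaseHTTPRequestHandler):
    protocol_version = "HTTP/1.1"  # advertise keep-alive support

    def do_GET(self):
        body = b"<html>ok</html>"
        self.send_response(200)
        # Content-Length lets the client find the message end without
        # the server closing the connection, so the connection can persist.
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        pass  # silence per-request logging

server = ThreadingHTTPServer(("127.0.0.1", 0), Handler)
threading.Thread(target=server.serve_forever, daemon=True).start()

conn = http.client.HTTPConnection("127.0.0.1", server.server_port)
conn.request("GET", "/initial.html")
first = conn.getresponse()
first.read()
sock_after_first = conn.sock  # remember the underlying socket

# Re-use the same TCP connection for the "inline image" request.
conn.request("GET", "/image1.gif")
second = conn.getresponse()
second.read()

reused = conn.sock is sock_after_first
print(first.status, second.status, reused)  # → 200 200 True
conn.close()
server.shutdown()
```

The point of the sketch is the last check: both requests complete over the
same socket, so the per-connection setup cost is paid once for the page and
all its inline images.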

OK, so a server can close the connection at any time.  But now we place an
extra burden on the server to implement heuristics and decisions: when to
drop a connection, how long is long enough, grepping documents for "<IMG",
and so on.
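[Editor's note: a toy version of that "<IMG" grep, to make the heuristic
concrete.  The function name and regex are illustrative only, not a
proposal; a real server would also have to weigh timeouts and load.]

```python
import re

# Case-insensitive match for an <IMG ...> tag in a served document.
IMG_RE = re.compile(rb"<img\b", re.IGNORECASE)

def expects_inline_requests(body: bytes) -> bool:
    # Heuristic: only keep the connection open if the document references
    # inline images the client is likely to fetch next.
    return IMG_RE.search(body) is not None

print(expects_inline_requests(b'<html><IMG SRC="logo.gif"></html>'))  # True
print(expects_inline_requests(b"<html>plain text</html>"))            # False
```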

Again, the above is just my opinion, and my opinions can change given good
arguments.  Let me hear them.
Dan DuBois, Software Animal                          ddubois@spyglass.com
(708) 505-1010 x532                     http://www.spyglass.com/~ddubois/
Received on Monday, 10 July 1995 15:14:36 UTC

This archive was generated by hypermail 2.3.1 : Wednesday, 7 January 2015 14:40:14 UTC