Re: multi-host virtual sites for HTTP 1.2

> I'd like to suggest an elaboration, to:
> 
> 	Referral	= "Referral" ":" prefix 1#referral-info
> 	prefix  	= absoluteURI
> 	referral-info	= referral-URI delta-seconds metric
> 	referral-URI	= absoluteURI
> 	metric		= 1*DIGIT
> 
> Thus, a referral would include not only a new URI, but a maximum
> age ("time-to-live" in DNS terms) and a cost metric.  The client
> algorithm would then be
> 	choose from the unexpired referral-infos (those whose
> 	delta-seconds is greater than the Age of the response)
> 	the referral-URI with the lowest metric.
> Or maybe that should be "highest metric"?  I dunno.
> 
> We could perhaps get rid of the expiration (delta-seconds) parameter if
> we made it explicit that a referral lasts only as long as the
> Cache-control: max-age of the response that carries it.  But
> you need an expiration mechanism of some sort, or else these bindings
> are impossible to revoke.
> 
> We might want to modify that client algorithm so that "unexpired
> referral-info" includes "which has not been unresponsive recently".
> I.e., if a server is playing hard-to-GET, drop it from the list.

Something like this seems like a reasonable idea. It does wander over
into the ground that URNs hoped to cover, but it makes the server
responsible for distributing its own replication information, and so
avoids the wider infrastructure questions.
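
Read literally, the quoted client algorithm might look something like
the rough Python sketch below; the ReferralInfo class and field names
are purely illustrative, not part of any draft, and "unexpired" is read
as the response's Age still being less than the referral's
delta-seconds:

from dataclasses import dataclass

@dataclass
class ReferralInfo:
    uri: str            # referral-URI (an absoluteURI)
    delta_seconds: int  # maximum age of the binding ("time-to-live")
    metric: int         # server-advertised cost metric

def choose_referral(infos, response_age, unresponsive=frozenset()):
    """Pick the unexpired, responsive referral with the lowest metric."""
    candidates = [
        info for info in infos
        if response_age < info.delta_seconds    # still unexpired
        and info.uri not in unresponsive        # not playing hard-to-GET
    ]
    if not candidates:
        return None
    return min(candidates, key=lambda info: info.metric)

(Whether "lowest" or "highest" metric should win is exactly the
ambiguity the quoted message points out.)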

I think the most important thing, if something like this is to be
useful, is to have an explicit model of the semantics that mandates
some things, suggests others, and explicitly leaves the rest up to
implementations. If people start with a vague model and apply lots of
heuristics, interoperability could suffer.

The gopher developers tried to introduce server replication at one
point, but it was not widely implemented, perhaps because fewer servers
were seeing single-server overload at the time. (Though the need for
server replication has been evident since the advent of archie, and
people have been trying to offer better solutions ever since.)

In the case of a widely geographically distributed resource (like,
say, the info-mac or rfc mirror sites), a cost metric could only serve
as a first approximation to what a client (or proxy) might determine
empirically to be the actual network costs. Though, like MX record
priorities, it might still represent a server-side preference among
alternatives that appear to have equal utility to the client.

(Which makes it important to define whose "cost" is in the model.)
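
As a purely hypothetical illustration of that MX-style reading, reusing
the ReferralInfo sketch above: a proxy could take the advertised metric
as a coarse, server-side ordering and fall back on its own measurements
to distinguish otherwise-equal alternatives.  The measured_rtt table is
something the proxy would have to maintain itself; it is not part of
the proposal.

def rank_referrals(infos, measured_rtt):
    """Order candidates by advertised metric, then by observed cost."""
    def key(info):
        # Alternatives we have never measured sort last within their
        # metric class.
        return (info.metric, measured_rtt.get(info.uri, float("inf")))
    return sorted(infos, key=key)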

I have visions of multinomial logit choice models dancing in my head,
but I suppose that gets to be a bit much for a poor web client
to deal with on a per-site basis....

But it does have one property I like: rather than just saying "choose
the referral-info with the best cost metric", I'd like to say that
clients should choose in a quasi-random way among options with an equal
cost metric.  (What they do in the case of unequal costs needs further
refinement to define what this would mean.)
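
A minimal sketch of that rule, again reusing the ReferralInfo sketch
above.  The handling of unequal metrics here (narrow down to the
lowest-metric group, then choose uniformly within it) is just one
possible reading of the part that still needs refinement:

import random

def pick_referral(infos):
    """Choose quasi-randomly among referrals sharing the best metric."""
    if not infos:
        return None
    best = min(info.metric for info in infos)
    equal_best = [info for info in infos if info.metric == best]
    # A uniform random choice spreads load across equally-preferred
    # replicas rather than piling everyone onto the first one listed.
    return random.choice(equal_best)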

-- 
    Albert Lunde                      Albert-Lunde@nwu.edu
