Re: Globalizing URIs

Martin J Duerst (
Thu, 17 Aug 1995 22:27:21 +0200 (MET DST)

Message-Id: <>
Subject: Re: Globalizing URIs
To: (Keith Moore)
Date: Thu, 17 Aug 1995 22:27:21 +0200 (MET DST)
In-Reply-To: <> from "Keith Moore" at Aug 17, 95 06:36:17 am
From: Martin J Duerst <>

>> >That is, a user ought to be able to know whether a link is likely to
>> >break before he puts it in his hotlist.
>> Persistence is not just a yes/no decision. 
>I agree with this.  But I've been thinking lately about what it takes
>to make a document id persistent for the long term, and have concluded
>that it requires some prior planning (where do you put it so that it
>will continue to be accessible) and some commitment to providing
>the necessary resources.  Also, sometimes an author knows that a 
>document is going to be revised many times, and thus references to
>(say) the third chapter of that document aren't likely to be useful
>for long, because the third chapter will sooner or later be something
>completely different than it originally was.
>I'm not sure how best to indicate this, and am interested in seeing
>others' ideas on the topic.

So it is more the reference than the document itself that you want
to keep persistent? E.g. you would accept that for some reason a
document disappears completely, but you would like to be able
to find the document as long as it exists, and for that you have to
guarantee that the reference has eternal life?

But then, if it turns out that I will have to pay somebody to
guarantee that the reference doesn't disappear, couldn't I
just pay them to put my document on one of their servers
for eternity, or for as long as I don't recall it? E.g. if I want to be
sure that my home page can still be accessed even if I get fired
from the place where I currently work, it would already be
possible today for an independent company to offer such a service,
and they could use normal URLs, such as, with xxxxx possibly
being a meaningful text depending on customer preference.
The only risks in this scheme are
	1) The possibility that the company goes broke
		(which might be alleviated by some "official" internet
		guarantee, or an appropriate insurance)
	2) The internet deciding to reshuffle domain names or to take
		protocols out of service.
	3) Some higher force such as legal action or the end of the world.
What is interesting is that there is no need to have any new schemes,
and that it may be possible that such a company already exists
(although I think that typical internet service providers list such hazards
as reshuffling their directory structure among their disclaimers).
Also, it indicates that no further work may be necessary, and that if such
a business opportunity is not taken, it just means that there is not enough
demand for it, which probably can't be changed by a better URx scheme.

I don't know how much the internet community values such economic
arguments, but the impression we have of the US here in Europe is
that such arguments usually make their mark :-).

>> Most of the "missing link" problems are due to a) the things really becoming
>> obsolete and b) the initial errors and sloppiness of server administrators
>> and document writers who take their time to realize that they should
>> have thought twice before deciding on/changing a domain name or the
>> location and name of a document. 
>I don't disagree with you, but are you sure?  I haven't seen any 
>measurements on it, and this strikes me as the sort of case where
>one's intuition is likely to be wrong.  Are there any studies on
>why links go bad?

I don't have any, but would also be interested if there are.
But what would you think are the other reasons for "missing links"?

Regards,	Martin.