Re: [semanticweb] Purl.org offline?

Peter Ansell said the following on 2008-10-15 22:58:
> ----- "Reto Bachmann-Gmür" <reto.bachmann@trialox.org> wrote:
>
>> From: "Reto Bachmann-Gmür" <reto.bachmann@trialox.org>
>> To: "Giovanni Tummarello" <giovanni.tummarello@deri.org>
>> Cc: "carmen r" <_@whats-your.name>, "semantic-web at W3C" <semantic-web@w3.org>
>> Sent: Thursday, 16 October, 2008 6:33:07 AM GMT +10:00 Brisbane
>> Subject: Re: [semanticweb] Purl.org offline?
>>
>> Giovanni Tummarello said the following on 2008-10-15 20:49:
>>> Hey stop it there friend :-) this is basically like saying
>>> resolvable URIs are bad, unthinkable
>>>
>> The weakness does not lie in the resolvability of the URIs but in
>> the centralized aspects of the DNS system. Fulfilling the design
>> principle of decentralization takes some effort but usually brings
>> significant benefits in terms of long-term scalability and
>> stability. I don't think the W3C is a club of hippie scientists for
>> adhering to that principle.
>>
>> How important it is that knowledge expressed in triples remains
>> understandable over time depends on the application, but I don't
>> think the question is irrelevant, nor that the answer must
>> necessarily be based on quantum entanglement.
>
> I think the answer to at least some of these questions might be an
> RDF equivalent of the WayBackMachine [1], but with more than one
> provider instead of just archive.org: services that can cache entire
> RDF documents, detect when the documents change, and provide the
> different versions in the future.
I'm not talking about retrieving old triples but about interpreting the
terms used in such triples. Can we rely on HTTP to deliver authoritative
definitions? Do we have to find out the age of a triple and use
something like the WayBackMachine (or the graph versioning system [1]) to
find the definition? And if our code generates triples, should it
regularly check that the available definition is still the same as when
the code was written?
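That last check could be sketched as follows: record a fingerprint of each term's definition at code-authoring time, then compare it against the definition re-fetched later. This is only an illustrative sketch; the definition texts are made up, and how the definition is retrieved over HTTP is left out.

```python
import hashlib


def definition_fingerprint(definition_text: str) -> str:
    """Return a stable fingerprint of a term's natural-language definition."""
    return hashlib.sha256(definition_text.encode("utf-8")).hexdigest()


def definition_unchanged(current_text: str, recorded_fingerprint: str) -> bool:
    """True if the definition retrieved today still matches the one
    recorded when the triple-generating code was written."""
    return definition_fingerprint(current_text) == recorded_fingerprint


# At code-authoring time, record the fingerprint of the term definition
# (the text here is a made-up example):
recorded = definition_fingerprint("A software agent that acts on behalf of a user.")

# Later, after re-fetching the definition, compare:
definition_unchanged("A software agent that acts on behalf of a user.", recorded)
definition_unchanged("An online casino portal.", recorded)
```

A mismatch would tell the code that the term's published meaning has drifted since the triples were generated.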
>
> The DNS system isn't broken, and for most levels of the DNS system
> there is massive redundancy. The lack of redundancy mostly appears at
> the last one or two levels in the hierarchy, where redundancy is
> still available but not commonly implemented.
We are talking about different things here. I'm talking about the
centralized control of the system: when choosing a term based on a
domain name, you have to trust ICANN, or trust the world to switch to
ORSN [2] fast enough when the situation requires it (that is, when you
think a change in the DNS system leading to a new definition of your
terms was wrong). The same trust requirements apply to every hierarchy
level of every domain name used in your terms.
>
> Also, there are methods to ensure that casinos don't take over sites
> which have community support and brand recognition, so the casino
> example isn't exactly true, although a registration may still lapse
> because the community hasn't supported something enough to give
> donations to keep the domain name stable. Everyone wants a great
> service for free, but inevitably something will be a limiting factor.
The takeover by the casino site could have been avoided with limited
resources. In other cases keeping a domain name could depend on
investing large sums in lawyers [3]. In yet other cases it would be
virtually impossible (think of a .iq domain registered before 2002 [4]).

However, I think the stability of a term and the service delivering its
definition are two different things: I might pay for the latter, but I
don't want to have to pay just so that assertions I once made don't
suddenly change their meaning.

> If you really want to trust some triples meet the author in person
> and exchange flash drives full of RDF and transcribe their GPG key
> onto a piece of paper. Then you have a full permanent copy of their
> triples (as at a particular date) and you can just internally repoint
> their particular domain name to your local cache for future use as
> long as you survive. (Of course this DNS redirection method doesn't
> work in the general case for purl.org but it will for domain-specific
> hosts)
There are easier ways to reach a high level of certainty that triples
have in fact been asserted by an author. However, requiring secure
systems to include all definitions of terms (presumably resorting to
natural language, as with the skos:definition property) whenever triples
are exchanged makes it hard to combine data from different sources, and
I think there could be easier ways to guarantee at least a relative
stability of the meaning of terms.
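For concreteness, shipping a skos:definition alongside the data could look like the following minimal sketch in plain N-Triples strings (no RDF library). The example URIs and the definition text are made up for illustration; only the skos:definition property URI is real.

```python
# The real SKOS property for natural-language definitions:
SKOS_DEF = "<http://www.w3.org/2004/02/skos/core#definition>"


def with_definitions(data_triples, definitions):
    """Append one skos:definition triple per term to an exchanged
    payload, so the receiver doesn't depend on resolving the term's
    domain name later to learn what the term means."""
    extra = [
        f'<{term}> {SKOS_DEF} "{text}" .'
        for term, text in definitions.items()
    ]
    return data_triples + extra


# Hypothetical example payload (URIs invented for illustration):
payload = with_definitions(
    ["<http://example.org/doc1> <http://example.org/vocab#knobot> <http://example.org/agent7> ."],
    {"http://example.org/vocab#knobot": "A software agent that crawls knowledge bases."},
)
```

The point of the sketch is the cost it makes visible: every exchange grows by one definition triple per term, which is exactly the overhead that makes combining data from many sources harder.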

Cheers,
reto

1. http://gvs.hpl.hp.com/
2. http://en.wikipedia.org/wiki/Open_Root_Server_Network
3. Two years ago I was contacted by a lawyer asking me to immediately
cease using the term knobot and the domain knobot.org, as "Knowbot" is
a trademark of CNRI
4.
http://www.theinquirer.net/en/inquirer/news/2005/08/08/iraq-gets-internet-domain-name

Received on Thursday, 16 October 2008 07:58:10 UTC