
RE: Summary of the QName to URI Mapping Problem

From: pat hayes <phayes@ai.uwf.edu>
Date: Tue, 28 Aug 2001 16:38:17 -0700
Message-Id: <v0421010db7b1d8f21d7d@[130.107.66.237]>
To: Patrick.Stickler@nokia.com
Cc: www-rdf-logic@w3.org
(Sorry about the delay in responding.)

> > Again, I take your point, but I think you mis-state the case. Nobody
> > is saying that arbitrary XML Qname usage is expected to follow RDF
> > rules. But if someone sets out to use RDF, then why would it be
> > unreasonable to expect them to abide by the published RDF
> > conventions? Isn't that the point of publishing a spec, to "impose"
> > the conventions on anyone who chooses to use the system being
> > specified?
>
>But arbitrary QName usage (or rather namespace selection) is an
>unavoidable fact of the SW.
>
>Because the SW is supposed to be based on the syndication of knowledge
>from a broad range of disparate sources -- sources whose authors may
>not even realize where their knowledge is being used -- and therefore
>RDF cannot presume that every single namespace in every single case
>is going to be non-collisive with any other namespace for all names
>involved. You are naively presuming a level of control and synchronization
>that does not and cannot exist on a global scale.

I beg to differ. I am not naively assuming anything of the kind, and 
I am not assuming anything about 'closed systems' either. My point 
concerns interactions between sources. Of course the SW must allow 
syndication of knowledge, just as the WWW places no restrictions on 
what can or cannot be done with HTML. But just as the WWW could not 
function without some global assumptions about communication (suppose, 
for example, there were no HTML standard) and global assumptions about 
transfer protocols (e.g. HTTP), the SW will have to 
make use of some global, or at least globally accessible, assumptions 
about how content is to be encoded for transfer between agents. 
Software is not telepathic or magical: it needs to be written with 
certain specifications in mind. The whole point of a language like 
RDF is to provide such specifications.

>The core mechanisms of RDF *must* preserve the integrity of all data.

That is obviously impossible. Nothing, and certainly not RDF, can 
preserve the integrity of ALL data. There are probably still some 
COBOL files out there with data in them, for example. All that any 
RDF engine can possibly do is to preserve the integrity of all data 
*which are presented to it in legal RDF*.

>If RDF wishes to e.g. require that namespace URIs used for RDF
>serializations *must* end in a non-name character, fine. But so
>long as it is legal and possible for two sources to define qnames
>in total ignorance of one another which may collide and introduce
>ambiguity into the knowledge base, then this is an unacceptable
>state of affairs.

Do you think it is unacceptable that two programs might be unable to 
communicate when one of them is using HTTP and the other is, in total 
ignorance of the other, using PPTP?
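
[For concreteness, the collision at issue can be sketched as follows. 
RDF/XML derives a property URI by concatenating the namespace URI and 
the local part of the QName, so two QNames chosen in ignorance of one 
another can map to the same URI. The namespace URIs below are 
hypothetical, purely for illustration:]

```python
def qname_to_uri(namespace_uri, local_name):
    # RDF/XML forms the full URI by simple concatenation of the
    # namespace URI and the QName's local part.
    return namespace_uri + local_name

# Two authors, unaware of each other, pick namespaces and local names
# (hypothetical URIs) that happen to concatenate identically:
a = qname_to_uri("http://example.org/vocab/ab", "c")   # e.g. ns1:c
b = qname_to_uri("http://example.org/vocab/a", "bc")   # e.g. ns2:bc

# Distinct QNames, same resulting URI -- the ambiguity Patrick describes.
assert a == b == "http://example.org/vocab/abc"
```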

>Think global.  Think chaotic.  Think WWW.   Eh?

I think it is best for all concerned if I decline to respond to that.

>The present mapping function works fine for closed systems where
>all content is owned and controlled by a single authority, but
>that's *not* how the SW is supposed to work!

To publish a spec and invite anyone who wants to use it is not to 
have it 'owned and controlled' by anyone, let alone a single 
authority. If you think the SW is going to work by the liberal use of 
pixie dust, then I had better sell my Nokia stock.

Pat Hayes

---------------------------------------------------------------------
(650)859 6569 w
(650)494 3973 h (until September)
phayes@ai.uwf.edu 
http://www.coginst.uwf.edu/~phayes
Received on Tuesday, 28 August 2001 19:37:12 GMT
