W3C home > Mailing lists > Public > www-rdf-interest@w3.org > February 2004

Re: URI: Name or Network Location?

From: Patrick Stickler <patrick.stickler@nokia.com>
Date: Thu, 19 Feb 2004 10:09:19 +0200
Message-Id: <EAFF20D0-62B2-11D8-BAE4-000A95EAFCEA@nokia.com>
Cc: "Rhoads, Stephen" <SRhoads@ThruPoint.net>, "'www-rdf-interest@w3.org'" <www-rdf-interest@w3.org>
To: "ext Benjamin Nowack" <bnowack@appmosphere.com>

On Feb 18, 2004, at 22:26, ext Benjamin Nowack wrote:

> another problem is indeed the unambiguous distinction between online 
> resources
> and their rdf descriptions (if you don't know what you get back when 
> you are
> dereferencing the URI).
>> So maybe we *don't* fall back to returning an RDF description and all 
>> I get
>> is 404.  And so I go ahead and issue an MGET to the server for the 
>> URI and
>> get back an RDF description.  That I *do* like, because I am 
>> essentially
>> saying "give me the metadata you have for this URI" rather than "give 
>> me a
>> representation of this URI".
> yes, URIQA seems to be a well-elaborated system for explicitly saying 
> "give
> me the semantics". (unfortunately it's too proprietary for me. I'm 
> currently
> using a combination of http-accept, rdf autodiscovery and querystring
> extensions, which is non-standard as well but can be implemented with 
> existing
> tools and works for me so far.)

I wouldn't necessarily say that new methods are "proprietary", as you
can implement support for new methods based on the open standards just
as easily as you can implement support for new headers and query string
parameters.

Rather, and admittedly, it is simply a bit more work to implement support
for new methods than for headers/parameters.

I'm not religiously devoted to new methods. It would be great if the
necessary behavior for URIQA could be captured without new methods (and
I've tried several different approaches), but unfortunately, new
methods are the only means I've found to arrive at a solid, robust
solution.
The biggest issue is knowing when a URIQA request has not been understood
by a server. If you just use headers, there is no way to ensure that the
headers have been understood by the server *before* the server attempts
to provide some response (yes, there is an Internet-Draft that tries to
introduce mandatory headers into HTTP, but it's vaporware).

An earlier incarnation of URIQA used a header, which was also required
to be included in the response, so that the client could check that the
original request was understood. Unfortunately, it doesn't scale. This
is because a given URI could denote a resource which has *huge*
representations (e.g. an MPEG encoding of a movie), so the client is
forced to either do two requests (HEAD plus GET) or eat the whole
representation -- neither case is acceptable IMO.
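The cost of that header-echo design can be sketched as follows. This is a
minimal illustration, not URIQA's actual header name: "URIQA-include" and the
legacy server are hypothetical, and Python's stdlib HTTP machinery stands in
for a real web authority. A server that ignores the header gives no echo, so
the client must spend a HEAD just to learn the request wasn't understood:

```python
import http.client
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer

class LegacyHandler(BaseHTTPRequestHandler):
    # A pre-URIQA server: it ignores the hypothetical "URIQA-include"
    # request header and would happily stream a huge representation.
    def do_HEAD(self):
        self.send_response(200)
        self.send_header("Content-Length", "2147483648")  # pretend: a movie
        self.end_headers()

    def log_message(self, *args):  # keep the sketch quiet
        pass

server = HTTPServer(("127.0.0.1", 0), LegacyHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

conn = http.client.HTTPConnection("127.0.0.1", server.server_address[1])
conn.request("HEAD", "/movie", headers={"URIQA-include": "description"})
resp = conn.getresponse()
resp.read()

# No echoed header: the request was not understood, and we only found
# out by paying for an extra round trip; a straight GET would instead
# have cost us the whole 2 GB representation.
understood = resp.getheader("URIQA-include") is not None
print(understood)  # False
server.shutdown()
```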

SW requests should not incur the cost of multiple HTTP requests. One
shouldn't have to first query the server to find out how to or if
one can query the server for resource descriptions. It should be
possible to simply ask, and if the server doesn't understand the
request, you get a quick and unambiguous response indicating so.


If some server is not URIQA-enlightened, it will complain. Simple.
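That failure signal is exactly what a new method buys you. A minimal sketch,
using Python's stdlib HTTP server (which, like most HTTP implementations,
answers an unrecognized method with 501) as a stand-in for a non-URIQA
server; the resource path is illustrative:

```python
import http.client
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer

class PlainHandler(BaseHTTPRequestHandler):
    # Implements GET only; an unknown method such as MGET automatically
    # draws a 501 Not Implemented from BaseHTTPRequestHandler.
    def do_GET(self):
        body = b"a representation"
        self.send_response(200)
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):  # keep the sketch quiet
        pass

server = HTTPServer(("127.0.0.1", 0), PlainHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

conn = http.client.HTTPConnection("127.0.0.1", server.server_address[1])
conn.request("MGET", "/resource")
status = conn.getresponse().status
print(status)  # 501: a quick, unambiguous "I don't speak URIQA"
server.shutdown()
```

One request, one definitive answer: the client never risks misreading an
ordinary representation as a description.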

BTW, the first open source release of the Nokia Semantic Web Server
implementation, with a full URIQA service implementation, is just
about ready. It is based on Intellidimension's RDF Gateway server.
A J2EE version for Tomcat/Jetty/etc. will be coming shortly as well.

Hopefully, that will help folks who are interested in approaches such
as URIQA by providing a working implementation as a starting point.

> hope to see some guidance coming from semWeb
> phase 2 (best practices, data access, ...?) soon.

Likewise. I see approaches such as URIQA and e.g. RDF Net API as
being complementary. URIQA is about requests to authoritative
servers based solely on the URI. RDF Net API provides for general
interaction with a knowledge server (3rd party knowledge).

Their relationship is very similar to that between HTTP and a
service such as DejaVu. If you want an authoritative representation
of a resource, you execute a GET request to the web authority
of the URI. If you want a representation of a resource from an
archive such as DejaVu, you specify the URI as a parameter (even
if the representation returned by DejaVu might have *once* been
authoritative, or even identical to an existing authoritative
representation, it is not authoritative as it is not coming from
the web authority of the URI).

Similarly, if you want an authoritative description of a resource,
you execute an MGET request to the web authority of the URI. If
you want a description of the resource from some other knowledge
store, you specify the URI as a parameter. Etc.
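The two request shapes can be contrasted in a small sketch. The hosts, the
`/describe` endpoint, and the `uri` parameter name are all hypothetical;
only the MGET method comes from URIQA itself:

```python
from urllib.parse import urlencode, urlsplit

def authoritative_request(uri):
    """Request to the URI's own web authority: MGET on the URI's path."""
    parts = urlsplit(uri)
    return ("MGET", parts.netloc, parts.path or "/")

def third_party_request(store, uri):
    """Request to some other knowledge store: the URI goes in a parameter
    of a (hypothetical) query endpoint, so the answer is not authoritative."""
    return ("GET", store, "/describe?" + urlencode({"uri": uri}))

uri = "http://example.org/resource"
print(authoritative_request(uri))
print(third_party_request("knowledge.example.net", uri))
```

The first request can only ever reach the authority for the URI; the second
can be pointed at any store that happens to hold knowledge about it.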

Thus, we need *both* solutions such as URIQA and the RDF Net API,
just as we need both HTTP and web services.




Patrick Stickler
Nokia, Finland
Received on Thursday, 19 February 2004 03:09:44 UTC
