RE: [linking-data] What should I link to? (or link between)

> If so, then this could be a valid LD coverage identifier.

Or perhaps ‘a valid identifier for a coverage delivered from a WCS service’.
We might want to think about whether a protocol-neutral identifier would make sense as well.

From: Jon Blower [mailto:j.d.blower@reading.ac.uk]
Sent: Thursday, 1 October 2015 1:03 AM
To: Peter Baumann <p.baumann@jacobs-university.de>
Cc: Clemens Portele <portele@interactive-instruments.de>; Cox, Simon (L&W, Clayton) <Simon.Cox@csiro.au>; Jeremy Tandy <jeremy.tandy@gmail.com>; public-sdw-wg@w3.org
Subject: Re: [linking-data] What should I link to? (or link between)

Hi Peter,

I’m showing my ignorance of the details of WCS here, but is the following URL resolvable by itself?

http://www.acme.com/wcs?SERVICE=WCS&VERSION=2.0&COVERAGEID=DtmGermany25


If so, then this could be a valid LD coverage identifier. But I thought (forgive me if I’m wrong) that you would need extra parameters to form a complete DescribeCoverage or GetCoverage request that the server would accept.
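
For concreteness, here is a small sketch in Python (standard library only) of the distinction I have in mind. The parameter names follow the usual WCS 2.0 KVP conventions and the endpoint is just the example from this thread, so please treat it as illustrative rather than authoritative:

from urllib.parse import urlencode

endpoint = "http://www.acme.com/wcs"
coverage_id = "DtmGermany25"

# The URL quoted above: it names the coverage (service, version, id)
# but carries no REQUEST parameter, so it is not a complete operation request.
identifier_url = endpoint + "?" + urlencode(
    {"SERVICE": "WCS", "VERSION": "2.0", "COVERAGEID": coverage_id})

# A complete DescribeCoverage request (returns the coverage description).
describe_url = endpoint + "?" + urlencode(
    {"SERVICE": "WCS", "VERSION": "2.0", "REQUEST": "DescribeCoverage",
     "COVERAGEID": coverage_id})

# A complete GetCoverage request (returns the data itself; FORMAT is just
# an illustrative choice here).
get_url = endpoint + "?" + urlencode(
    {"SERVICE": "WCS", "VERSION": "2.0", "REQUEST": "GetCoverage",
     "COVERAGEID": coverage_id, "FORMAT": "image/tiff"})

print(identifier_url)
print(describe_url)
print(get_url)

Whether the first URL dereferences to something useful on its own is exactly the question above.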

PS2: I am also not arguing against RDF _descriptions_; storing a complete request is but one option. However, in order for a URI to be resolvable it invariably will need some final, down-to-earth access protocol, be it a file path, ftp, or a Web service. That was the point I meant to make. Sorry if I was unclear on this.

No that’s OK - I think we’re agreeing on this point.

But it’s worth noting that you don’t *always* need the underlying data. For example, imagine that you want to make an assertion that “this dataset has low quality in the South Atlantic”. In this case you want to describe the subset you are talking about. The URI then resolves to this description (probably an RDF document), not necessarily the data itself. You don’t really need access to the data to get value out of that assertion, although it can help of course!
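
Purely to illustrate, here is what such a description might look like, sketched with rdflib in Python. The subset URI, the dataset URI and the ex: terms are invented for this example and are not a vocabulary proposal:

from rdflib import Graph, Literal, Namespace, URIRef
from rdflib.namespace import DCTERMS, RDF

EX = Namespace("http://example.org/def/")                        # hypothetical vocabulary
subset = URIRef("http://example.org/id/subset/south-atlantic")   # hypothetical subset URI

g = Graph()
g.add((subset, RDF.type, EX.CoverageSubset))
g.add((subset, DCTERMS.isPartOf,
       URIRef("http://example.org/id/coverage/some-ocean-dataset")))  # hypothetical dataset
g.add((subset, EX.spatialExtent, Literal("South Atlantic")))
g.add((subset, EX.qualityNote,
       Literal("This dataset has low quality in the South Atlantic")))

print(g.serialize(format="turtle"))

Dereferencing the subset URI could return a document along these lines, whether or not the underlying data happen to be reachable at that moment.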

Not sure many people will want to get an image as RDF triples - this is where Semantic World is opening itself to acknowledge foreign citizenships :)

I think you might have misunderstood me. Being a good “web citizen” doesn’t mean that everything has to be in RDF (I used RDF as an example only), but it does (in my view) involve using HTTP in a fairly standard way. My example was content negotiation - WxS uses a custom mechanism, but should (arguably) use the standard HTTP mechanism. It’s still fine to return NetCDF, GeoTIFF etc! (I’m sure these issues have been debated in the OGC-REST conversations.)
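
As a rough sketch of what I mean by standard content negotiation, assuming a hypothetical coverage URI whose server honours the Accept header (which is exactly the behaviour WxS does not offer today):

import requests

uri = "http://example.org/id/coverage/DtmGermany25"  # hypothetical LD URI

# Ask for an RDF description of the coverage...
rdf = requests.get(uri, headers={"Accept": "text/turtle"})

# ...or ask the same URI for the data itself, e.g. as GeoTIFF.
data = requests.get(uri, headers={"Accept": "image/tiff"})

print(rdf.status_code, rdf.headers.get("Content-Type"))
print(data.status_code, data.headers.get("Content-Type"))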

Cheers,
Jon



On 30 Sep 2015, at 15:10, Peter Baumann <p.baumann@jacobs-university.de<mailto:p.baumann@jacobs-university.de>> wrote:

Hi Jon,
On 2015-09-30 14:10, Jon Blower wrote:
Hi Peter,

Yes, I agree up to a point, but I think that a WxS URL is very unlikely to be persistent for a “long” time - this is not a criticism of WxS, but a recognition of the fact that access protocols change quite rapidly - versions change, underlying data change, etc.

I get your point, but I still don't see evidence for such a general statement. As I responded to Clemens, INSPIRE services will stay, and
    http://www.acme.com/wcs?SERVICE=WCS&VERSION=2.0&COVERAGEID=DtmGermany25

will remain valid for quite a long time.
Underlying data change does not affect this URL, nor do format changes.

PS: just to restate: I am _not_ advocating WCS here, but resolvable service URLs in general, as such services will be the predominant future ecosystem.


Of course, this problem is not WxS-specific and is not solved simply by switching protocol. But there are mechanisms for organisations to mint and care for persistent URLs, and these don’t map well to custom query-oriented protocols (at least, not yet). This is why I think we can persistently store a *description* of a coverage subset, even if we can’t guarantee long-term persistent access to the coverage itself through a consistent protocol and endpoint. If we separately store the subset description (a relatively simple task, probably...) we can map it to different access protocols and endpoints as they evolve, and so keep the data accessible.
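
To make that idea concrete, here is a hypothetical sketch (all names and URLs are invented for illustration) of keeping a persistent identifier stable while the access mapping underneath it changes:

# persistent identifier -> whatever access route is current today
CURRENT_ACCESS = {
    "http://example.org/id/coverage/DtmGermany25": {
        "protocol": "WCS 2.0 KVP",
        "endpoint": "http://www.acme.com/wcs",
        "query": {"SERVICE": "WCS", "VERSION": "2.0",
                  "REQUEST": "GetCoverage", "COVERAGEID": "DtmGermany25"},
    },
}

def resolve(persistent_id):
    # If the service moves or changes protocol, only this table changes;
    # the persistent identifier, and any descriptions that cite it, do not.
    return CURRENT_ACCESS[persistent_id]

print(resolve("http://example.org/id/coverage/DtmGermany25"))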

PS2: I am also not arguing against RDF _descriptions_; storing a complete request is but one option. However, in order for a URI to be resolvable it invariably will need some final, down-to-earth access protocol, be it a file path, ftp, or a Web service. That was the point I meant to make. Sorry if I was unclear on this.



Also, there is the specific question of the behaviour of WxS services, which don’t use standard HTTP mechanisms for content negotiation and other things. For example, ideally I would like a Linked Data URL to return RDF when I GET it with an appropriate Accept header. I think this is part of what Ed was referring to when he talked about “good LD citizenship”. If WxS behaved more like the Web in general we might have fewer of these problems, but there is history in OGC dating back some time.

Not sure many people will want to get an image as RDF triples - this is where Semantic World is opening itself to acknowledge foreign citizenships :)

cheers,
Peter

Received on Thursday, 1 October 2015 02:18:40 UTC