- From: Svensson, Lars <L.Svensson@dnb.de>
- Date: Mon, 18 May 2015 14:56:05 +0000
- To: Martynas Jusevičius <martynas@graphity.org>
- Cc: Kingsley Idehen <kidehen@openlinksw.com>, "public-lod@w3.org" <public-lod@w3.org>
Martynas,

On Monday, May 18, 2015 3:14 PM, Martynas Jusevičius wrote:

> what you describe here is a classic case of data quality control. You
> don't want any data to enter your system that does not validate
> against your constraints.

Yes, that is one use case.

> As mentioned before, SPARQL and SPIN have been used for this purpose
> for a long time. There are readily available constraint libraries:
> http://semwebquality.org/ontologies/dq-constraints. But you can easily
> create custom ones since they're just SPARQL queries. Constraints can
> be (de)referenced from remote systems as well.

OK, what I haven't understood yet is how a client and a server can
negotiate the constraints the client wants the data to meet. Given is a
server that has no SPARQL endpoint but is capable of serving RDF
conforming to two profiles/shapes/preferences "profile:A" and
"profile:B" (possibly identified by the URIs
http://example.com/profiles/A and http://example.com/profiles/B). When a
client wants data adhering to profile:B in text/turtle, what would the
HTTP GET request look like, and what would you get when you dereference
http://example.com/profiles/B with "Accept: text/turtle"?

> Our Graphity Linked Data platform provides a SPIN validator which
> checks every incoming RDF request:
> http://graphityhq.com/technology/graphity-processor#features

Nice, but my case is not only about validation. It's also about having a
way to describe the constraints in a fashion that clients and servers
can understand. If I understand correctly, you are saying that SPIN is
the best way of doing that.

Best,

Lars
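To make the constraint discussion above concrete, here is a minimal
sketch of the kind of plain-SPARQL constraint Martynas refers to, where
any resource returned by the query is one that violates the rule. The
class, property and rule are invented for illustration and are not
taken from the semwebquality.org library.

    # Hypothetical constraint: every foaf:Person must have a foaf:name.
    # Any ?person returned by this query violates the constraint.
    PREFIX foaf: <http://xmlns.com/foaf/0.1/>

    SELECT ?person
    WHERE {
      ?person a foaf:Person .
      FILTER NOT EXISTS { ?person foaf:name ?name }
    }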
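And a sketch of the exchange the profile question envisages. The
resource path /data/resource1 is made up for the example, and the
profile parameter on the Accept header is purely hypothetical,
text/turtle registers no such parameter; agreeing on some mechanism of
this kind is exactly the open question.

    GET /data/resource1 HTTP/1.1
    Host: example.com
    Accept: text/turtle; profile="http://example.com/profiles/B"

Dereferencing the profile URI itself would then be an ordinary request;
what representation it should return (a set of SPIN/SPARQL constraints,
a shape, or something else) is the part that still needs a shared
convention.

    GET /profiles/B HTTP/1.1
    Host: example.com
    Accept: text/turtle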
Received on Monday, 18 May 2015 14:56:35 UTC