
RE: OLDM using the Hydra vocabulary for expressing property constraints

From: Markus Lanthaler <markus.lanthaler@gmx.net>
Date: Wed, 18 Jun 2014 13:34:57 +0200
To: <public-hydra@w3.org>
Message-ID: <02b501cf8ae9$563dd6a0$02b983e0$@gmx.net>
On 16 Jun 2014 at 12:36, Benjamin Cogrel wrote:
> Le 14/06/2014 22:46, Markus Lanthaler a écrit :
>> Do you intend to share this information with the client or will this just be used on the
>> server side? In the latter case, I'm wondering whether it wouldn't be
>> simpler to just create a mapping property-validator directly in code
>> when you instantiate the ResourceManager or Model!?
> 
> Yes, I intend to share this information with the client (except if the
> server wants to keep it secret). The client is then free to interpret it
> or not.

Fair enough.


> What seems nice about SPIN templated constraints is that they allow us
> to define new RDF properties and, at the same time, their automatic
> transformation into SPARQL queries (please note that this does not imply
> the presence of a SPARQL endpoint; a query can simply be applied to the
> RDF representation of a resource). I expect to generate new validators
> (and their mappings) automatically from the definitions of these
> properties. Such a declarative approach would kill two birds with one
> stone :-)

Yeah, it's definitely a powerful approach and worth investigating further. If you experiment with it, please post a couple of examples to this list so that other people get an idea of what it looks like (I don't think everyone on this list is familiar with SPIN). For validation, you might also want to have a look at ShEx:

 - http://www.w3.org/2013/ShEx/Primer
 - http://www.w3.org/2013/ShEx/Definition.html
 - https://www.w3.org/2001/sw/wiki/ShEx/RDF_serialization
 - https://www.w3.org/2001/sw/wiki/ShEx
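To make the templated-constraint idea a bit more concrete, here is a rough Python sketch of how a declarative property definition could be expanded into a SPARQL ASK query for a validator to run against a resource's local RDF representation. The template, function, and property parameters are all invented for illustration; this is not SPIN's actual mechanism, just the general shape of it:

```python
# Hypothetical sketch: expanding a declarative constraint into a SPARQL
# ASK query, in the spirit of SPIN templates. All names are invented.

# A templated constraint: the query is instantiated once per property
# definition. ?this is bound to the resource being validated.
MAX_LENGTH_TEMPLATE = """
ASK {{
    ?this <{property}> ?value .
    FILTER (STRLEN(STR(?value)) > {max_length})
}}
"""

def build_validator_query(property_iri, max_length):
    """Instantiate the template for one property definition."""
    return MAX_LENGTH_TEMPLATE.format(property=property_iri,
                                      max_length=max_length)

# A validator would consider the resource invalid if this ASK returns true.
query = build_validator_query("http://xmlns.com/foaf/0.1/name", 80)
```

A query produced this way could then be evaluated against the resource's own triples with any SPARQL engine, without a full endpoint.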


>>> Yes, I agree: on the client side we cannot always assume that (i) a
>>> SPARQL endpoint is available, (ii) the server will perform no validation,
>>> and (iii) the server will accept our local representation. I will propose
>>> an interface to abstract the use of a SPARQL endpoint. For me,
>>> implementations of this interface (e.g. LD Fragments or Hydra clients)
>>> should be in charge of mapping the client and server representations.
>>> What do you think?
>>
>> The absence of SPARQL is one thing. The other thing I was talking about is entity
>> representations themselves. A client might already have a Python class representing a
>> person. When it retrieves the representation of a foaf:Person from the server, it somehow has
>> to map the data it got to that class. Obviously, that mapping has to be bidirectional.
> 
> First, let me clarify a technical detail. By contrast with the other
> ORMs I know, there is no Python *class* representing a person in OldMan.

Yeah, I realize that. But if you write an application, you will already have such a class. If you start to interact with an API...

> There is one Model *object* that represents an RDFS class such as
> foaf:Person or my-example:LocalPerson [6]. A JSON-LD context + the

... you typically need to map the data you get from the Web API to such a class (instance).

> schema describing an RDFS class + an IRI generator should provide enough
> information to generate a new Model object in most cases.

Fully agreed. Nevertheless, you need to import the data you get from the Web API into your application's object graph in some form or another. That's why I talked about a mapper. This is primarily about clients accessing an API, not servers providing it.
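To illustrate what such a mapper could look like, here is a minimal sketch of a bidirectional mapping between a Web API's JSON-LD-style representation of a foaf:Person and an application-level Python class. The Person class and the term-to-attribute mapping are invented for illustration; in practice the mapping would be derived from the JSON-LD context and schema:

```python
# Hypothetical sketch of a bidirectional mapper between a Web API
# representation and a local application class. All names are invented.

class Person:
    """An application-level class the client already has."""
    def __init__(self, name, email):
        self.name = name
        self.email = email

# JSON-LD term -> Python attribute (assumed to come from the context/schema)
TERM_MAP = {"name": "name", "mbox": "email"}

def from_representation(doc):
    """Map an incoming JSON-LD-like dict onto a local Person instance."""
    kwargs = {attr: doc.get(term) for term, attr in TERM_MAP.items()}
    return Person(**kwargs)

def to_representation(person):
    """Map a local Person instance back to the API's terms."""
    return {term: getattr(person, attr) for term, attr in TERM_MAP.items()}

p = from_representation({"name": "Alice", "mbox": "alice@example.org"})
round_trip = to_representation(p)
```

The point is simply that both directions have to exist: the same table drives the import and the export, so the two representations cannot drift apart.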


> As a second step, we can relax the assumption that the (main) Web API
> controls the SPARQL endpoint and see this Web API as the client of other
> independent (sub) Web APIs. OldMan, as an OLDM, is the module of the
> main Web API that is in charge of CRUD operations*. This OLDM can now be
> seen as the client of a datastore, where the latter may use a different
> representation than the one of the main Web API** and may enforce its
> own data validation. I will propose an interface between the core of
> OldMan and client modules for interacting with SPARQL endpoints, LDF and
> LDP servers, Hydra Web APIs, etc. These client modules will be in charge
> of the mapping between local and remote representations you discussed.
> As agents, they will execute the CRUD "goals" assigned by the core part.

OK, it looks as though we are more or less on the same page. I probably separated client and server too much...


> You mentioned the integration of non-CRUD Hydra operations; this could
> be a third step. Currently, the OLDM uses the Hydra description of the
> main Web API as the schema of its local representation. If non-CRUD
> operations should appear on Resource or Model objects, I think they
> should be operations provided (i) by the Web APIs the OLDM is client of
> or (ii) by a common abstraction of them. The latter abstraction would
> reduce the coupling. If I guess right, this would turn OldMan into a
> generic Hydra client library, wouldn't it?

Right. That's what I had in mind.


> However, one thing still
> confuses me: how can we obtain nice Python methods (like [7]) from these
> Hydra operations?

I'm not that much of a Python programmer, but you could probably use magic methods:
  - __getattr__ to retrieve the right operation: myobject.CommentAction
  - __call__ on that operation object to make it callable: myobject.CommentAction("hi there")
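Something along these lines, as a rough sketch (not OldMan's actual API; the class names, the operation name, and the dispatch logic are all invented for illustration):

```python
# Sketch: exposing Hydra operations as callable attributes via
# __getattr__ and __call__. All names are invented for illustration.

class Operation:
    def __init__(self, resource, name):
        self.resource = resource
        self.name = name

    def __call__(self, *args):
        # A real client would build and send the HTTP request here.
        return "performed %s on %s with %r" % (self.name, self.resource.iri, args)

class Resource:
    def __init__(self, iri, operations):
        self.iri = iri
        self._operations = set(operations)

    def __getattr__(self, name):
        # Only invoked for attributes not found the normal way, so
        # regular attributes like self.iri are unaffected.
        if name in self._operations:
            return Operation(self, name)
        raise AttributeError(name)

myobject = Resource("http://example.org/articles/1", {"CommentAction"})
result = myobject.CommentAction("hi there")
```

Since __getattr__ is only consulted when normal attribute lookup fails, the Hydra operations never shadow the resource's ordinary attributes.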

Otherwise, using less magic, you could do something similar to what I proposed in my last mail:

>>    article.perform("LikeAction")
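That explicit style could look roughly like this (again a sketch with invented names, just to contrast it with the magic-method approach):

```python
# Sketch of the "less magic" alternative: looking up a Hydra operation
# by name and executing it. All names are invented for illustration.

class Article:
    def __init__(self, iri, operations):
        self.iri = iri
        self._operations = operations  # operation name -> callable

    def perform(self, name, *args):
        """Look up an operation by name and execute it."""
        try:
            return self._operations[name](self.iri, *args)
        except KeyError:
            raise ValueError("Unknown operation: %s" % name)

def like(iri):
    # A real client would send the HTTP request for the LikeAction here.
    return "liked %s" % iri

article = Article("http://example.org/articles/1", {"LikeAction": like})
result = article.perform("LikeAction")
```

The trade-off is explicitness: perform() makes it obvious that the operation name comes from the API description at runtime, at the cost of a slightly noisier call site.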

> The CRUDController is a module I have quickly written some time ago for
> having a better understanding of the scope of this project.
> As it will be a main component of some CRUD Web API implementations, the
> relation with some Hydra operations is something that should be clarified.

Yep.


> Thank you Markus for this very interesting discussion,

Thank you!



> [1] http://linkeddatafragments.org/
> [2] https://github.com/HydraCG/Specifications/issues/40
> [3] https://github.com/oldm/OldMan/issues/9
> [4] http://semwebquality.org/ontologies/dq-constraints
> [5] http://oldman.readthedocs.org/en/latest/oldman.rest.html
> [6] https://github.com/oldm/OldMan/blob/master/examples/quickstart_schema.ttl
> [7] http://oldman.readthedocs.org/en/latest/oldman.html#oldman.model.Model.create
> [8] http://www.w3.org/blog/2011/05/hash-uris/


--
Markus Lanthaler
@markuslanthaler
Received on Wednesday, 18 June 2014 11:35:28 UTC
