RE: Mapping between a generic Web API and an LDP Server

On 27 Jan 2015 at 22:08, Tomasz Pluskiewicz wrote:
> Hi Miguel, welcome to the list! Great to have you here

I second that. Welcome on board, Miguel!


> On 2015-01-27 18:10, Miguel wrote:
>> Hi everybody,
>> I am working on a platform to define linked data applications using a
>> dataflow language based on SPARQL.

Could you elaborate a bit on this? Or provide some pointers to get a better idea of what exactly you are working on?


>> I am new to the Hydra spec and I am still studying it.

Most importantly: don't hesitate to ask questions here.


>> I have a problem that is not strictly in the scope of Hydra, but it
>> seems to me quite related.
>> I hope that I am not completely off topic.
>> 
>> Let's say I want to build a server that has to comply with a given Web
>> API that uses JSON,

So basically you want to build a server which exposes some interface in a very specific way. For example, you want to build something that does the same as Twitter's or Facebook's API and also looks exactly the same. Is that correct?


>> and I want to implement it backed on an LDP server.

Hmm... I'm not sure how I should interpret that, as LDP defines an interface, not a backend implementation.


>> I can define a suitable Linked Data model and then map it from/to JSON
>> with a JSON-LD profile.
>> Then I have to somehow map the operations supported by the Web API to
>> operations on LDP (assuming that the operations offered by the Web API
>> are simple enough to be mapped directly to LDP operations).

In other words, it just supports the CRUD operations... right?


> I don't exactly understand the workflow. Could you please give an
> example?
> 
>> 
>> The question is: is there a standard way to do this second mapping?

Hydra does that. We even had the basic CRUD operations in Hydra (well, they are still in the spec, but we decided to drop them [1]), which allowed you to do exactly that. Now you either have to define your own operations or leverage Schema.org Actions to describe the semantics of an operation.
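Just to illustrate, here is a minimal sketch of how that could look (the URLs and the Comment class are made up, of course). The operation is typed both as a Hydra Operation and as a Schema.org CreateAction to convey its create semantics:

  {
    "@context": [
      "http://www.w3.org/ns/hydra/context.jsonld",
      { "schema": "http://schema.org/" }
    ],
    "@id": "http://api.example.com/doc/#Comment",
    "@type": "Class",
    "supportedOperation": [{
      "@type": [ "Operation", "schema:CreateAction" ],
      "method": "POST",
      "expects": "http://api.example.com/doc/#Comment",
      "returns": "http://api.example.com/doc/#Comment"
    }]
  }

A generator could then translate such an operation to, e.g., a POST to an LDP container holding Comment resources.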


>> I can do it ad-hoc in different server-side languages, but ideally I
>> would use some RDF vocabulary, like Hydra.

So you basically want to describe the Web API in RDF and generate the implementation out of that?


>> In fact, Hydra seems very close to what I need, because it maps Web API
>> operations to their Linked Data meaning.
>> The main difference is that I would like to use this mapping to DEFINE
>> the behaviour of a Web API, while (if I understood its purpose
>> correctly) Hydra is used to DOCUMENT (to the client) the behaviour of a
>> Web API.

It should be quite straightforward to use a Hydra ApiDocumentation to auto-generate a server implementing the API - as long as the operations are simple enough. Effectively, it is the opposite of what the HydraBundle [2] is doing... even though it already has some CRUD controller code generation which does most of the work for you.
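To give you an idea of what such a generator would consume, here is a stripped-down ApiDocumentation (again with made-up URLs) that references the Comment class from the earlier sketch:

  {
    "@context": "http://www.w3.org/ns/hydra/context.jsonld",
    "@id": "http://api.example.com/doc/",
    "@type": "ApiDocumentation",
    "entrypoint": "http://api.example.com/",
    "supportedClass": [
      { "@id": "http://api.example.com/doc/#Comment" }
    ]
  }

Everything the generator needs - the classes, their supported properties and operations - hangs off this single document.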


> Hm, Hydra is used to describe a Web API using RDF terms, yes. So what do
> you mean by define? Like define so that it can be later implemented?
> 
>> 
>> In general, it seems to me that such a mapping could be useful to
>> integrate existing Web APIs with linked data workflows.

This probably needs a bit more explanation as well. What are "linked data workflows"?


>> What do you think about it?
> 
> This should be possible with relative ease. Of course, a kind of proxy or
> adapter would be necessary, depending on your setup. Essentially, if I
> understand correctly, such a use case is the main selling point of JSON-LD.
> The first step would be to enhance each existing JSON response with
> @context, @type and @id. Assuming a good REST design, building the Hydra
> ApiDocumentation would then be accomplished as if the API had been Linked
> Data from the beginning. And of course, incoming payloads must be converted
> to JSON-LD if necessary and normalized by using the right @context.
> 
> Is that roughly what you have in mind?
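To make that first step concrete: a legacy response such as { "id": "5", "text": "Nice post!" } could be turned into JSON-LD just by adding a context (the property mappings below are purely illustrative):

  {
    "@context": {
      "@base": "http://api.example.com/comments/",
      "id": "@id",
      "text": "http://schema.org/text"
    },
    "@type": "http://api.example.com/doc/#Comment",
    "id": "5",
    "text": "Nice post!"
  }

If you can't touch the JSON bodies at all, JSON-LD also allows you to point clients to the context out of band via an HTTP Link header with rel="http://www.w3.org/ns/json-ld#context".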


[1] https://github.com/HydraCG/Specifications/issues/11
[2] http://bit.ly/HydraBundleGH



--
Markus Lanthaler
@markuslanthaler

Received on Tuesday, 27 January 2015 21:36:37 UTC