RE: Mapping between a generic Web API and an LDP Server

On 29 Jan 2015 at 00:28, Miguel wrote:
> Thanks Tomasz and Markus for your warm welcome and the feedback!
> 
> On Tue, Jan 27, 2015 at 10:36 PM, Markus Lanthaler
> <markus.lanthaler@gmx.net> wrote:
>> 
>> On 27 Jan 2015 at 22:08, Tomasz Pluskiewicz wrote:
>> 
>>> On 2015-01-27 18:10, Miguel wrote:
>>>> Hi everybody,
>>>> I am working on a platform to define linked data applications using a
>>>> dataflow language based on SPARQL.
>> 
>> Could you elaborate a bit on this? Or provide some pointers to get a better
>> idea of what exactly you are working on?
> 
> The language basically allows the developer to define pipelines of RDF
> operators to build complex RDF transformations, which may also hold
> state (e.g., to define interactive applications). The behavior of most
> of the nodes in the pipeline is defined through SPARQL 1.1 queries,
> which may also be dynamically generated by other nodes.
> It is still in a very experimental stage, but I was able to build some
> simple interactive visualizations with it.
> Some basic facts about the platform and links to publications and
> code are on swows.org

Cool... I assume it's not open-sourced (yet)? In any case, you might be interested in trying to integrate that with NoFlo [1].


>>>> I have a problem that is not strictly in the scope of Hydra, but it
>>>> seems to me quite related.
>>>> I hope that I am not completely off topic.
>>>> 
>>>> Let's say I want to build a server that has to comply with a given Web
>>>> API that uses JSON,
>> 
>> So basically you want to build a server which exposes some interface in a
>> very specific way. For example you want to build something which does the
>> same as Twitter's or Facebook's API and also looks exactly the same. Is that
>> correct?
> 
> Exactly.
> 
>> 
>>>> and I want to implement it backed by an LDP server.
>> 
>> Hmm... Not sure how I should interpret that, as LDP defines an interface, not
>> a backend implementation.
> 
> Well, maybe I should rather say "backed by a server supporting the
> LDP interface".
> That is, the software I would like to generate from the mapping
> described below is a proxy/adapter between a client "speaking" that
> specific API (e.g., Twitter's API) and a server with an LDP interface.

I see. Well, for simple CRUD APIs it should be quite straightforward. You transform request bodies from the client to JSON-LD by injecting a predefined context. The methods should more or less match for most APIs. Otherwise you could write a simple proxy which does the translation for the client. The server responses can (in most cases) be transformed to the correct form by either just compacting [2] them with a specific context or, in more complex scenarios, by framing them [3] (framing isn't fully standardized yet, but most JSON-LD processors support it nevertheless).
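
To make that translation concrete, here is a minimal sketch using the PyLD library; the context and the property names below are made up and not taken from any real API:

  # Minimal sketch of the translation, assuming PyLD is installed
  # (pip install pyld); the context and keys are invented examples.
  from pyld import jsonld

  # Predefined context mapping the API's plain JSON keys to IRIs
  CONTEXT = {"name": "http://schema.org/name",
             "body": "http://schema.org/text"}

  # Client -> server: turn a plain JSON body into JSON-LD by
  # injecting the predefined context
  def to_jsonld(plain_body):
      doc = dict(plain_body)
      doc["@context"] = CONTEXT
      return doc

  # Server -> client: compact the JSON-LD response with the same
  # context and strip @context to recover the "plain" JSON shape
  def to_plain_json(ldp_response):
      compacted = jsonld.compact(ldp_response, CONTEXT)
      compacted.pop("@context", None)
      return compacted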


>>>> I can define a suitable Linked Data model and then map it from/to JSON
>>>> with a JSON-LD profile.
>>>> Then I have to somehow map the operations supported by the Web API to
>>>> operations on LDP (assuming that the operations offered by the Web API
>>>> are simple enough to be mapped directly to LDP operations).
>> 
>> In other words, it just supports the CRUD operations.. right?
> 
> Exactly.
> 
>>> I don't exactly understand the work flow. Could you please give some
>>> example?
> 
> A basic example: there is some existing client using a specific Web
> API. I want to (re)design the server around an LDP interface for
> interoperability, but at the same time I want to keep using that old
> client.
> 
>>>> The question is: is there a standard way to do this second mapping?
>> 
>> Hydra does that. We even had the basic CRUD operations in Hydra (well, we
>> still have them in the spec but decided to drop them [1]) which allowed you to
>> do that. Now you have to either define your own operations or leverage
>> Schema.org Actions to describe the semantics of the operation.
> 
> Great, thank you for the pointer to the discussion.
> Is this usage of Schema.org Actions documented more in detail somewhere?

No, not yet. The assumption till now has been that you simply type the Hydra operation also as a specific Schema.org Action. We might refine that, though.
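
Just to sketch what that would look like (written as a Python dict for consistency with the other snippets; the expected class is a made-up example):

  # Illustrative only: an operation typed both as a Hydra Operation
  # and as a Schema.org CreateAction to convey its semantics
  operation = {
      "@type": ["hydra:Operation", "schema:CreateAction"],
      "hydra:method": "POST",
      "hydra:expects": "http://example.org/vocab#Tweet"  # made up
  }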


>>>> I can do it ad-hoc in different server-side languages, but ideally I
>>>> would use some RDF vocabulary, like Hydra.
>> 
>> So you basically want to describe the Web API in RDF and generate the
>> implementation out of that?
> 
> Exactly.
> 
>>>> In fact, Hydra seems very close to what I need, because it maps Web API
>>>> operations to their Linked Data meaning.
>>>> The main difference is that I would like to use this mapping to DEFINE
>>>> the behaviour of a Web API, while (if I understood its purpose
>>>> correctly) Hydra is used to DOCUMENT (to the client) the behaviour of a
>>>> Web API.
>> 
>> It should be quite straightforward to use a Hydra ApiDocumentation to
>> auto-generate a server implementing the API - as long as the
>> operations are simple enough. Effectively it is the opposite of what
>> the HydraBundle [2] is doing... even though it has some CRUD
>> controller code generation which does most of the work for you.
> 
> Then I will definitely try it.
> 
>>> Hm, Hydra is used to describe a Web API using RDF terms, yes. So what do
>>> you mean by "define"? Like define it so that it can later be implemented?
> 
> In the sense of being able to generate the implementation out of that
> definition.
> 
>>>> In general, it seems to me that such a mapping could be useful to
>>>> integrate existing Web APIs with linked data workflows.
>> 
>> This probably needs a bit more explanation as well. What are
>> "linked data workflows"?
> 
> Well, I still have only a vague idea of how it may work in general,
> but I will try to describe my view.
> 
> I used the term "linked data workflows" loosely to mean basically any
> system or agent interacting with linked data through standard
> protocols like the LDP interface, Hydra documented APIs, SPARQL
> endpoints, Linked Data Fragments interface, etc.
> 
> JSON-LD has been designed to bridge the gap between "semantic-less"
> JSON-based applications and the linked data / RDF world. One great
> thing about JSON-LD is that by defining a profile you enable a two-way
> conversion, from JSON to RDF and back.

Please don't confuse profiles [4] with contexts [5] - they serve different purposes.
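
Roughly: a profile is a media type parameter used in HTTP content negotiation to ask for a specific JSON-LD form, whereas a context lives inside the document and maps keys to IRIs. A small sketch (the resource URL is made up):

  # Hypothetical illustration using the requests library
  import requests

  # The *profile* selects the document form via content negotiation
  resp = requests.get(
      "http://example.org/api/resource",  # made-up URL
      headers={"Accept": 'application/ld+json; '
               'profile="http://www.w3.org/ns/json-ld#compacted"'})

  # The *context*, in contrast, is part of the document itself
  print(resp.json().get("@context"))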


> If you could do the same kind of thing with a whole Web API, I guess
> it would be much easier to "port" existing Web APIs to the "linked
> data world".
> Especially if from this mapping I could generate two adapters, one
> converting from Web API calls to LDP interface calls (or linked data
> version of the API, if there are operations more complex than CRUD),
> the other converting calls the other way around.
> 
> If, as an example, I define such a mapping for Twitter's API, I could
> then create a linked data application that integrates with the
> standard Twitter client and the Twitter server.
> In this "linked Twitter" application the tweets sent by the client go
> through an adapter to an LDP container (that is just an example, I do
> not know if the Twitter web client can be easily connected to a
> different server).
> Then some (linked data) agents do some fancy stuff, enriching my
> tweets with all sorts of Linked Open Data and finally put the tweets in
> another LDP container. That container is just a façade for another
> adapter connected to the real Twitter server, so that my enriched
> tweets finally get published.
> 
> Pushing this scenario further, this mapping between Twitter's API and
> linked data would have to be defined just once to enable people to
> build their linked-Twitter agents or more complex pipelines.

Hmm... you need to be careful here: you need a bidirectional mapping, which might become difficult since you might lose information when going to LDP.

  --------                   ------------                   -------------
 | client |--( tw 2 ldp )-->| LDP server |--( ldp 2 tw )-->| Twitter API |
  --------                   ------------                   -------------
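
The "tw 2 ldp" leg could, very roughly, look like the sketch below; every URL, path and context term is invented for illustration, and authentication and error handling are ignored:

  # Very rough sketch of the "tw 2 ldp" adapter; all URLs, paths
  # and context terms are invented, auth and errors are ignored.
  import json
  import requests
  from flask import Flask, jsonify, request

  app = Flask(__name__)
  LDP_CONTAINER = "http://example.org/ldp/tweets/"    # made up
  TWEET_CONTEXT = {"text": "http://schema.org/text"}  # made up

  @app.route("/1.1/statuses/update.json", methods=["POST"])
  def update_status():
      # Translate the Twitter-style form body into JSON-LD ...
      body = {"@context": TWEET_CONTEXT,
              "text": request.form["status"]}
      # ... and POST it to the LDP container, which creates a new
      # member resource and returns its URI in the Location header
      r = requests.post(LDP_CONTAINER, data=json.dumps(body),
                        headers={"Content-Type": "application/ld+json"})
      return jsonify({"id_str": r.headers.get("Location")}), r.status_code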


>>>> What do you think about it?
>>> 
>>> This should be possible with relative ease. Of course a kind of proxy or
>>> adapter would be necessary depending on your setup. Essentially, if I
>>> understand correctly, such a use case is the main selling point of
>>> JSON-LD. The first step would be to enhance each existing JSON
>>> response with @context, @type and @id. Assuming a good REST design,
>>> building the Hydra ApiDocumentation would then be accomplished as if
>>> the API had been linked data from the beginning. And of course
>>> incoming payloads must be converted to
>>> JSON-LD if necessary and normalized by using the right @context.
>>> 
>>> Is that roughly what you have in mind?
> 
> Yes, that is exactly what I have in mind.
> But I would like somehow to generate this piece of software from a
> purely declarative definition of the mapping that could possibly
> consist only of:
> - the JSON-LD profile
> - the Hydra API documentation

As you know, Hydra wasn't really designed for this, but I think you could bend it a bit to do what you want.
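
For the plain CRUD case, the declarative input could be as small as an ApiDocumentation fragment like the following (the class and URLs are illustrative, not normative):

  # Sketch of an ApiDocumentation fragment a generator could consume
  api_doc = {
      "@context": "http://www.w3.org/ns/hydra/context.jsonld",
      "@type": "ApiDocumentation",
      "supportedClass": [{
          "@id": "http://example.org/vocab#Tweet",  # made up
          "supportedOperation": [{
              "@type": "Operation",
              "method": "POST",
              "expects": "http://example.org/vocab#Tweet"
          }]
      }]
  }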


Please keep us posted on your progress,
Markus


[1] http://noflojs.org/
[2] http://www.w3.org/TR/json-ld-api/#compaction
[3] http://json-ld.org/spec/latest/json-ld-framing/
[4] http://www.w3.org/TR/json-ld/#iana-considerations
[5] http://www.w3.org/TR/json-ld/#the-context


--
Markus Lanthaler
@markuslanthaler

Received on Thursday, 29 January 2015 22:53:50 UTC