
Re: grlc turns your Linked Data queries into Linked Data APIs

From: Andreas Harth <andreas@harth.org>
Date: Mon, 30 Jan 2017 22:37:23 +0100
To: public-lod@w3.org
Message-ID: <a67ea5ad-ac68-bf64-a514-1dfa8eb42bc0@harth.org>
Hi Pieter,

interesting ideas!

I just wanted to add Linked Data-Fu [1] to the list, which allows you
to specify link traversals in Notation3 syntax, in combination with
a SPARQL query processor.

We'll run a tutorial about the system and related ideas at ESWC [2].


[1] https://linked-data-fu.github.io/
[2] http://harth.org/andreas/2016/eswc-tut/

On 01/26/17 16:13, Pieter Colpaert wrote:
> Hi Albert,
> Nice work! With The DataTank [1] we also released a similar feature back
> in 2012: it takes SPARQL templates as its input, describes its output
> using DCAT-AP, sets the right HTTP headers for e.g. caching, and
> supports content negotiation. Besides BASIL, LimeDS [2] also provides
> similar functionality, acting as a data adapter that connects various
> interfaces through a visual interface.
> I like these kinds of frameworks, as they bridge the gap between
> publishing data as interoperably as possible – for maximum reuse – and
> front-end developers who want an app on top of a number of triples.
> They form an abstraction layer which front-end developers can use to
> quickly create a UI on top of data without having to take the open
> world assumption into account. Such frameworks are great tools for
> digital signage providers [3] and similar types of reuse that need
> simple views, taking some of the processing away from a low-end device.
> To that extent, I would find it interesting if, in the same way, we
> could create an abstraction framework for more complex user agents:
> e.g., user agents that combine different data sources by crawling
> Linked Data using LDQL [4], the Linked Data Fragments client [5], or
> the Linked Connections client for public transit route planning [6].
> While an end-user might have to wait some time for an answer when the
> caches of these types of user agents are cold, results will probably
> come very fast once the caches are hot – not unthinkable when all your
> end-users are asking very similar questions.
> Kind regards,
> Pieter
> [1] https://github.com/tdt/core
> [2] http://limeds.be/
> [3] Back in 2012, I introduced The DataTank at this company and it did
> this trick well: https://flatturtle.com
> [4] http://olafhartig.de/files/HartigPerezLDQL_JWSPreprint.pdf
> [5] http://client.linkeddatafragments.org/
> [6] http://linkedconnections.org
> On 26-01-17 13:58, Albert Meroño Peñuela wrote:
>> Hi all,
>> Just letting you know that there is a public instance of grlc
>> available at [1]. No more hard-coded queries in your Linked Data
>> consuming applications!
>> grlc [5], inspired by tools like BASIL [4], is a small server that
>> converts your SPARQL queries into Linked Data APIs, automatically and
>> on the fly. To do this, it assumes that your SPARQL queries are
>> publicly available in a GitHub (or similar) repository. For example,
>> queries stored in https://github.com/CLARIAH/wp4-queries have their
>> equivalent API at http://grlc.io/api/CLARIAH/wp4-queries/api-docs
>> (notice the user and repository names in the URIs). You can call
>> individual API endpoints directly, e.g.
>> http://grlc.io/api/CLARIAH/wp4-queries/datasets
>> Full details are described in this paper [2].
>> The latest additions include a docker-based deployment, parameter
>> enumerations, result pagination, and compatibility with #LD servers,
>> RDF dumps, and HTML+RDFa pages (besides SPARQL endpoints).
>> We would be pleased to hear from your experiences on using grlc: bugs,
>> performance, use cases, feature requests, etc. grlc's issue tracker
>> can be found at [3].
>> Thanks,
>> Albert
>> [1] http://grlc.io
>> [2]
>> https://www.albertmeronyo.org/wp-content/uploads/2016/04/SALAD2016_paper_4.pdf
>> [3] https://github.com/CLARIAH/grlc/issues
>> [4] http://basil.kmi.open.ac.uk/app/#/collection
>> [5] https://github.com/CLARIAH/grlc
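The URL scheme Albert describes above maps a GitHub user, repository, and
query file onto grlc API URLs. A minimal sketch of that mapping, using only
the pattern visible in the two example URLs from the announcement (the
helper names here are my own, not part of grlc):

```python
# Sketch of the grlc URL scheme: queries stored in a GitHub repo at
# github.com/<user>/<repo> are exposed as API endpoints under
# grlc.io/api/<user>/<repo>/. Helper names are hypothetical.
from urllib.parse import quote

GRLC_BASE = "http://grlc.io/api"

def grlc_endpoint(user: str, repo: str, query: str) -> str:
    """Build the API URL for a named query stored in a GitHub repo."""
    return f"{GRLC_BASE}/{quote(user)}/{quote(repo)}/{quote(query)}"

def grlc_api_docs(user: str, repo: str) -> str:
    """Build the URL of the auto-generated API description page."""
    return f"{GRLC_BASE}/{quote(user)}/{quote(repo)}/api-docs"

# The two URLs from Albert's message:
print(grlc_api_docs("CLARIAH", "wp4-queries"))
# http://grlc.io/api/CLARIAH/wp4-queries/api-docs
print(grlc_endpoint("CLARIAH", "wp4-queries", "datasets"))
# http://grlc.io/api/CLARIAH/wp4-queries/datasets
```

An application could then fetch such an endpoint with any HTTP client
instead of hard-coding the SPARQL query itself, which is the point of the
announcement.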
Received on Monday, 30 January 2017 21:38:00 UTC
