Re: Keeping crawlers up-to-date

I feel compelled to mention RDFSync here, then (ISWC 2007):


Giovanni Tummarello, Christian Morbidoni, Reto Bachmann-Gmür, Orri Erling
"RDFSync: efficient remote synchronization of RDF models"
http://semanticweb.deit.univpm.it/papers/RDFSyncISWC2007.pdf

There was an implementation, but it was just a proof of concept.
It works with bnodes as well (that's the crux of the matter, actually; with
no bnodes it would be trivial). If there is REALLY interest, it could be
worked up to the point where it can be implemented in SPARQL somehow (Orri
was interested in this; personally I don't yet see the need for such frequent
updates, but I might be wrong and/or just lazy :-))
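To give an idea of why the bnode-free case is the easy one, here is a minimal
sketch (not RDFSync itself, and not any existing API; all names are
illustrative) of synchronizing two stores that contain only IRIs and literals,
where plain set difference over the triples is enough:

# Sketch only: ground (bnode-free) graphs can be synced by set difference.
def diff_ground_graphs(local, remote):
    """local/remote: sets of (subject, predicate, object) tuples with no
    blank nodes. Returns (to_add, to_remove) so that applying both to
    `local` makes it equal to `remote`."""
    to_add = remote - local
    to_remove = local - remote
    return to_add, to_remove

local = {("ex:doc1", "dc:title", '"Old title"')}
remote = {("ex:doc1", "dc:title", '"New title"')}
to_add, to_remove = diff_ground_graphs(local, remote)
# With blank nodes this breaks down: the "same" bnode carries a different
# internal identifier in each store, so naive set difference reports
# spurious changes. Handling that efficiently is what RDFSync is about;
# see the paper above for the actual algorithm.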

Giovanni



On Tue, Apr 28, 2009 at 4:15 PM, Sören Auer
<auer@informatik.uni-leipzig.de> wrote:

> Hi Yves, all,
>
> We envisioned publishing updates of LOD sources via a special LOD resource
> space on the LOD endpoint.
> The basic idea is to publish nested sets of updates as linked data for
> years, months, days, hours, minutes, seconds.
> This allows crawlers to only update resources which were recently changed.
> The idea is implemented and described for Triplify at:
>
> http://triplify.org/vocabulary/update
>
> There is also a section on that in the paper:
>
> Triplify - Lightweight Linked Data Publication from Relational Databases.
> Proceedings of WWW 2009.
> http://www.informatik.uni-leipzig.de/~auer/publication/triplify.pdf
> http://www.slideshare.net/soeren1611/triplify-1341084
>
> Cheers,
>
> Sören
>
>
> --
>
> --------------------------------------------------------------
> Sören Auer, AKSW/Computer Science Dept., University of Leipzig
> http://www.informatik.uni-leipzig.de/~auer,
>  Skype: soerenauer
>
>
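For what it's worth, a rough sketch of how a crawler might walk the nested
per-period update resources Sören describes above, so that it only re-fetches
what changed since its last crawl. The URI layout and granularity here are
hypothetical placeholders, not the actual Triplify update vocabulary (see
http://triplify.org/vocabulary/update for that):

# Sketch only: enumerate hypothetical per-day update URIs since the last crawl.
from datetime import datetime, timedelta

def update_paths_since(base, last_crawl, now=None):
    """Yield per-day update URIs between last_crawl and now, e.g.
    <base>/update/2009/4/28. Hours, minutes and seconds would nest
    the same way under each day."""
    now = now or datetime.utcnow()
    day = last_crawl.date()
    while day <= now.date():
        yield "%s/update/%d/%d/%d" % (base, day.year, day.month, day.day)
        day += timedelta(days=1)

for uri in update_paths_since("http://example.org", datetime(2009, 4, 26)):
    print(uri)  # a real crawler would dereference each URI and re-crawl
                # only the resources listed there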

Received on Tuesday, 28 April 2009 16:05:33 UTC