
Re: [foaf-dev] [foaf-protocols] FOAF sites offline during cleanup

From: Hugh Glaser <hg@ecs.soton.ac.uk>
Date: Wed, 29 Apr 2009 21:50:45 +0100
To: Kingsley Idehen <kidehen@openlinksw.com>
CC: Peter Williams <pwilliams@rapattoni.com>, Semantic Web <semantic-web@w3.org>, foaf-dev Friend of a <foaf-dev@lists.foaf-project.org>
Message-ID: <EMEW3|ab921882eb7dc8d788cbb0ead60bce36l3SLov02hg|ecs.soton.ac.uk|ACD7%hg@ecs.soton.ac.uk>
Thanks Kingsley,

On 29/04/2009 12:20, "Kingsley Idehen" <kidehen@openlinksw.com> wrote:

> Hugh,
> I absolutely understand your concern.
Thanks mate.
> To cut a long story short, how would you suggest we describe what we
> have? 
I think not as in your earlier post :-) :
"We are now nearing complete stability re uploads, deletes, and data
cleansing activity re. the Virtuoso instance hosting the LOD Cloud [1]."
> What about the following:
> 1. A collection of most of the data from the LOD-Cloud pictorial
> 2.  LOD-Cloud sample
I would suggest you describe it as a "mirror of some of the LOD data".
This will get across the idea that it is only partial, and that it is of
necessity out of synchronisation with the original sources, but is clearly
more than a sample. (I don't know if it is most - have you done some
analysis?)
>> I really don't want to be reviewing/seeing papers in a few months time where
>> people are presenting analysis they claim to have done of the "LOD cloud" or
>> similar, and they have based their data gathering on the misconception that
>> all they have to do is look at your cloud.
> Neither do I, but I have expressly called out to everyone that has
> contributed to the LOD-Cloud (warehouse) to verify what's been loaded so
> far. Sadly, deafening silence until we make any kind of claim.
This is the way of the world - you seem to have some expectation that the
most valuable use of my time is to keep checking to see what you have put in
your system. As you say, this LOD work is non-trivial, and we all have a
zillion other things to do.
However, as you know, when you have asked or made claims I have sometimes
gone and looked, as I did this time; but it can be pretty time-consuming to
go through someone else's store, sampling to see how many of the URIs you
expect to find are in fact missing. And each time you ask it becomes more of
a chore.
But I think that actually the onus is on the claimant to do some of their
own analysis and justification before making the claims.
For example, you might go through our void:exampleResource entries (or even
the ones that you already have in your system) to sample how many of them
you actually have the RDF for.
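That sampling check is easy to mechanise. The sketch below is only illustrative of the idea, not anything either of us has run: it assumes a SPARQL endpoint URL and uses an ASK query per sample URI to see whether the store holds any triples about it; the endpoint address and the exact ASK pattern are my assumptions, and the coverage helper just computes the fraction found.

```python
import json
import urllib.parse
import urllib.request

def uri_is_loaded(endpoint, uri):
    """ASK the store (hypothetical endpoint URL) whether it holds any
    triple with `uri` as subject or object."""
    query = "ASK { { <%s> ?p ?o } UNION { ?s ?p <%s> } }" % (uri, uri)
    url = endpoint + "?" + urllib.parse.urlencode(
        {"query": query, "format": "application/sparql-results+json"})
    with urllib.request.urlopen(url) as resp:
        return json.load(resp).get("boolean", False)

def coverage(sample_uris, check):
    """Fraction of the sampled URIs for which `check` finds data.

    `check` would normally be a closure over `uri_is_loaded` and an
    endpoint; it is a parameter here so the logic can be tested offline.
    """
    if not sample_uris:
        return 0.0
    found = sum(1 for uri in sample_uris if check(uri))
    return found / len(sample_uris)

# Usage sketch against a hypothetical endpoint:
# pct = coverage(example_resources,
#                lambda u: uri_is_loaded("http://example.org/sparql", u))
```

Running something like this over a dataset's void:exampleResource list would give the claimant a cheap, reportable coverage figure before any claim is made.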
> As you
> know this work is non trivial (in all respects).
> It would be really sad if the easy part of providing dataset
> verification feedback for our instance becomes the reason for it to
> stagnate and ultimately wither away (we do have a zillion other things
> to do with our time, seriously).
> The goal of what we call the LOD-Cloud instance is to provide the Linked
> Data Web with a powerful faceted browsing and entity information lookup
> solution based on Linked Data. To date we haven't even seen DBpedia
> replicas let alone what we now have. Both are significant validators of
> the Linked Data Web in general.
> I can assure you, I didn't have academic papers in mind when
> commissioning either of these endeavors.
> Kingsley
Received on Wednesday, 29 April 2009 20:52:10 UTC
