
Re: Think before you write Semantic Web crawlers

From: Lin Clark <lin.w.clark@gmail.com>
Date: Wed, 22 Jun 2011 22:01:06 +0100
Message-ID: <BANLkTiksHvE5FXs0fti7cf_U0vgJcGhWeQ@mail.gmail.com>
To: Sebastian Schaffert <sebastian.schaffert@salzburgresearch.at>
Cc: Martin Hepp <martin.hepp@ebusiness-unibw.org>, public-lod <public-lod@w3.org>
On Wed, Jun 22, 2011 at 9:33 PM, Sebastian Schaffert <
sebastian.schaffert@salzburgresearch.at> wrote:

> Your complaint sounds to me a bit like "help, too many clients access my
> data".

I'm sure that Martin is really tired of saying this, so I will reiterate for
him: It wasn't his data, they weren't his servers. He's speaking on behalf
of people who aren't part of our insular community... people who don't have
a compelling reason to subsidize a PhD student's Best Paper award with their
own dollars and bandwidth.

Agents can use Linked Data just fine without firing 150 requests per second
at a server. There are TONS of use cases that do not require that kind of
server load.
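The polite-crawling point is easy to make concrete. A well-behaved agent checks robots.txt and spaces out its requests instead of flooding the origin server. The sketch below is purely illustrative and not from this thread; the `PoliteFetcher` name, the example robots.txt, and the default delay are all invented assumptions:

```python
from urllib.robotparser import RobotFileParser

class PoliteFetcher:
    """Hypothetical sketch of a throttled Linked Data client that
    honors robots.txt Disallow rules and Crawl-delay."""

    def __init__(self, robots_txt: str, default_delay: float = 2.0):
        self.parser = RobotFileParser()
        self.parser.parse(robots_txt.splitlines())
        # Use the site's Crawl-delay if given, else a conservative default
        delay = self.parser.crawl_delay("*")
        self.delay = float(delay) if delay is not None else default_delay
        self._last_request = 0.0

    def allowed(self, url: str) -> bool:
        """True if robots.txt permits fetching this URL."""
        return self.parser.can_fetch("*", url)

    def wait_time(self, now: float) -> float:
        """Seconds to wait before the next request is polite."""
        return max(0.0, self._last_request + self.delay - now)

    def record_request(self, now: float) -> None:
        """Remember when we last hit the server."""
        self._last_request = now

# Invented example policy: 5-second crawl delay, /private/ off limits
robots = """User-agent: *
Crawl-delay: 5
Disallow: /private/
"""
fetcher = PoliteFetcher(robots)
```

A client built this way tops out at one request every few seconds per host, which is orders of magnitude below the rates Martin was complaining about, and it still covers most consumption use cases.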

Lin Clark
DERI, NUI Galway <http://www.deri.ie/>

Received on Wednesday, 22 June 2011 21:01:42 UTC
