
Re: Think before you write Semantic Web crawlers

From: Martin Hepp <martin.hepp@ebusiness-unibw.org>
Date: Tue, 21 Jun 2011 12:42:45 +0200
Cc: public-lod@w3.org
Message-Id: <182F28C7-0D7A-443A-8F6A-5B921515BAED@ebusiness-unibw.org>
To: Kingsley Idehen <kidehen@openlinksw.com>
Yes, RDF data dumps without traffic control mechanisms are an invitation to denial-of-service attacks.
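The kind of self-protection discussed below (throttle a SPARQL endpoint and re-route heavy agents to a static dump) can be sketched in a few lines. This is a minimal illustration, not anyone's actual deployment: the window size, request limit, and dump URL are all assumed values.

```python
# Hedged sketch: sliding-window rate limiter that re-routes heavy
# clients from a (hypothetical) SPARQL endpoint to a static data dump.
import time
from collections import defaultdict, deque

WINDOW = 60                       # seconds (assumed)
MAX_REQUESTS = 30                 # per client per window (assumed)
DUMP_URL = "/dumps/latest.nt.gz"  # hypothetical dump location

_hits = defaultdict(deque)        # client id -> recent request timestamps

def route(client_id, now=None):
    """Return (status, path): the SPARQL endpoint for well-behaved
    clients, or a redirect to the dump once the limit is exceeded."""
    now = time.time() if now is None else now
    q = _hits[client_id]
    # Drop timestamps that have fallen out of the sliding window.
    while q and now - q[0] > WINDOW:
        q.popleft()
    q.append(now)
    if len(q) > MAX_REQUESTS:
        return ("303 See Other", DUMP_URL)
    return ("200 OK", "/sparql")
```

A crawler issuing a burst of queries would get the first `MAX_REQUESTS` answered normally and then be redirected to the dump, which it can fetch once instead of hammering the endpoint.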

On Jun 21, 2011, at 12:28 PM, Kingsley Idehen wrote:

> On 6/21/11 11:23 AM, Kingsley Idehen wrote:
>> A looong time ago, very early LOD days, we (LOD community) talked about the importance of dumps with the heuristic you describe in mind (no WebID then, but it was clear something would emerge). Unfortunately, SPARQL endpoints have become the first point of call re. Linked Data even though SPARQL endpoint only == asking for trouble if you can self protect the endpoint and re-route agents to dumps. 
> Critical typo fix:
> 
> A looong time ago, very early LOD days, we (LOD community) talked about the importance of dumps with the heuristic you describe in mind (no WebID then, but it was clear something would emerge). Unfortunately, SPARQL endpoints have become the first point of call re. Linked Data even though SPARQL endpoint only == asking for trouble if you *can't* self protect the endpoint and re-route agents to dumps.
> 
> -- 
> 
> Regards,
> 
> Kingsley Idehen	
> President & CEO
> OpenLink Software
> Web: http://www.openlinksw.com
> Weblog: http://www.openlinksw.com/blog/~kidehen
> Twitter/Identi.ca: kidehen
Received on Tuesday, 21 June 2011 10:43:10 UTC

This archive was generated by hypermail 2.4.0 : Friday, 17 January 2020 16:21:13 UTC