Re: Think before you write Semantic Web crawlers

At 01:32 PM 6/23/2011, Sebastian Schaffert wrote:
>I am very well aware of the problem of adoption. At the same time, 
>we have a similar problem not only in the publication of the data 
>but also in the consumption: if we do not let users consume our data 
>even at large scale, what use is the data at all? I agree that 
>bombarding a server with crawlers just for harvesting as many 
>triples as possible without thinking about their use is stupid. But 
>it will always happen, no matter how many mails we have on the 
>Linked Data mailing list.

Yes, however major conferences such as ESWC, ISWC, and WWW could 
define guidelines for their paper submissions and actually reject 
papers that are based on denial-of-service attacks. It has become a 
trend to focus very much on size and to invite people to evaluate 
their results in this respect. In the same way, we could define 
criteria that exclude DoS attacks as a means of achieving this. 
Obviously this will not stop everyone in the wild out there, but it 
would at least keep the core of the academic Semantic Web community 
from burdening its own technological achievements.
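For what it's worth, the polite alternative to "bombarding a server" that this thread keeps returning to is straightforward to implement. Below is a minimal sketch (not anyone's actual crawler) of the two usual ingredients: honoring a site's Crawl-delay from robots.txt, and serializing requests to one host with a minimum pause between them. The default delay of 5 seconds is an assumption, not a standard.

```python
import time
import urllib.robotparser

# Assumed default: pause between requests when robots.txt states no preference.
DEFAULT_DELAY = 5.0

def crawl_delay_for(robots_txt_lines, agent="*", default=DEFAULT_DELAY):
    """Return the per-request delay (seconds) a polite crawler should use,
    taken from the site's robots.txt Crawl-delay if one is declared."""
    rp = urllib.robotparser.RobotFileParser()
    rp.parse(robots_txt_lines)          # accepts robots.txt as a list of lines
    delay = rp.crawl_delay(agent)       # None if no Crawl-delay is declared
    return float(delay) if delay is not None else default

class PoliteFetcher:
    """Enforce a minimum pause between successive requests to one host."""

    def __init__(self, delay):
        self.delay = delay
        self._last = 0.0

    def wait(self):
        """Block until at least `delay` seconds have passed since the last call."""
        pause = self.delay - (time.monotonic() - self._last)
        if pause > 0:
            time.sleep(pause)
        self._last = time.monotonic()

# Usage sketch: fetch the host's robots.txt once, then call wait() before
# every request to that host, e.g.
#   fetcher = PoliteFetcher(crawl_delay_for(robots_lines))
#   fetcher.wait(); ...issue HTTP request...
```

The point is that per-host throttling costs a handful of lines; a submission guideline along these lines would not be an onerous requirement.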

-- 
Dieter Fensel
Director STI Innsbruck, University of Innsbruck, Austria
http://www.sti-innsbruck.at/
phone: +43-512-507-6488/5, fax: +43-512-507-9872

Received on Thursday, 23 June 2011 11:43:05 UTC