W3C home > Mailing lists > Public > semantic-web@w3.org > December 2010

Re: foaf-search.net with enhanced functionality

From: Michael Brunnbauer <brunni@netestate.de>
Date: Thu, 23 Dec 2010 17:51:41 +0100
To: semantic-web@w3.org
Message-ID: <20101223165141.GA5241@netestate.de>

re

On Thu, Dec 23, 2010 at 05:40:43PM +0100, William Waites wrote:
> Hi Michael, this is good news. But I have a question: is it possible
> to point your robot at a dump to prevent it mercilessly crawling large
> datasets like bnb.bibliographica.org? If so, how?

As we use named graphs for provenance tracking, I see no way to make use of
a dump. Our crawler waits at least 10 seconds between two requests to the
same site. Of course I can block crawling of bnb.bibliographica.org if you
want. How many RDF files and pages with RDFa does it have?
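The per-site politeness rule described above could be sketched roughly as follows. This is a hypothetical illustration of a per-host minimum-delay tracker, not the actual netEstate crawler code; the class and method names are invented for the example.

```python
import time

class PoliteFetcher:
    """Track the last request time per host and enforce a minimum
    delay between two requests to the same site (hypothetical
    sketch, not the netEstate crawler)."""

    def __init__(self, min_delay=10.0):
        self.min_delay = min_delay
        self.last_request = {}  # host -> timestamp of last request

    def wait_time(self, host, now=None):
        """Seconds to wait before the next request to `host` is polite."""
        now = time.monotonic() if now is None else now
        last = self.last_request.get(host)
        if last is None:
            return 0.0  # never visited: no delay needed
        return max(0.0, self.min_delay - (now - last))

    def record(self, host, now=None):
        """Note that a request to `host` was just made."""
        self.last_request[host] = time.monotonic() if now is None else now
```

A crawler loop would call `wait_time()` before each fetch, sleep that long if it is positive, then call `record()` after the request completes.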

Regards,

Michael Brunnbauer

-- 
++  Michael Brunnbauer
++  netEstate GmbH
++  Geisenhausener Straße 11a
++  81379 München
++  Tel +49 89 32 19 77 80
++  Fax +49 89 32 19 77 89 
++  E-Mail brunni@netestate.de
++  http://www.netestate.de/
++
++  Sitz: München, HRB Nr.142452 (Handelsregister B München)
++  USt-IdNr. DE221033342
++  Geschäftsführer: Michael Brunnbauer, Franz Brunnbauer
++  Prokurist: Dipl. Kfm. (Univ.) Markus Hendel
Received on Thursday, 23 December 2010 16:52:11 GMT

This archive was generated by hypermail 2.3.1 : Tuesday, 26 March 2013 21:45:40 GMT