Re: Indexing extension

It seems to me that a solution might lie in clever use of, or
extensions to, the robots.txt exclusion file that most spiders
respect.  See

http://web.nexor.co.uk/mak/doc/robots/robots.html

if you're not familiar with this.
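For concreteness, a minimal robots.txt entry looks like this (the
path below is only illustrative):

    User-agent: *
    Disallow: /private/

Spiders that honor the protocol will skip everything under
/private/, so a well-behaved indexer never fetches the restricted
documents in the first place.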

Our search engine can hide the existence of inaccessible documents from the
user; I would assume, though I'm not certain, that others can do so as
well.  For example, you could intercept our CGI data (between the Web
daemon and our search daemon) to delete the security restriction for
queries coming from NASA sites.
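A rough sketch of such a filter, written here in Python; the
parameter name ("restrict"), the NASA host test, and the path to the
search daemon's CGI program are all assumptions for illustration,
not Harvest's actual interface:

    #!/usr/bin/env python3
    # Hypothetical CGI wrapper between the Web daemon and the search
    # daemon.  It strips an assumed security-restriction parameter
    # from queries originating at NASA hosts, then hands the
    # (possibly edited) query on to the real search CGI.
    import os
    import subprocess
    import urllib.parse

    SEARCH_CGI = "/usr/local/harvest/cgi-bin/search"  # assumed path

    def from_nasa(host):
        # Treat any host under nasa.gov as a NASA site.
        host = host.lower()
        return host == "nasa.gov" or host.endswith(".nasa.gov")

    def main():
        pairs = urllib.parse.parse_qsl(
            os.environ.get("QUERY_STRING", ""),
            keep_blank_values=True)
        if from_nasa(os.environ.get("REMOTE_HOST", "")):
            # Delete the security restriction so NASA queries see
            # everything; other queries pass through unchanged.
            pairs = [(k, v) for (k, v) in pairs if k != "restrict"]
        env = dict(os.environ)
        env["QUERY_STRING"] = urllib.parse.urlencode(pairs)
        subprocess.run([SEARCH_CGI], env=env)

    if __name__ == "__main__":
        main()

Because the wrapper only edits the query string and re-invokes the
search CGI, it needs no changes to either daemon.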

I'd be interested in hearing from others who are using Harvest.

Nick
