
Re: Indexing extension

From: Nick Arnett <narnett@verity.com>
Date: Sun, 28 May 1995 10:29:15 -0700
Message-Id: <abee613702021004fb2d@[192.187.143.12]>
To: http-wg%cuckoo.hpl.hp.com@hplb.hpl.hp.com
Cc: harvest-dvl@cs.colorado.edu, naic@nasa.gov, webmasters@nasa.gov

It seems to me that a solution might lie in clever use of, or extensions
to, the robots.txt exclusion file that most spiders respect.  See

http://web.nexor.co.uk/mak/doc/robots/robots.html

if you're not familiar with this.
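
To make the idea concrete: today's exclusion files look like the first
block below, and the extension might be an additional field along the
lines of the second block.  The Restrict-To line is purely a sketch of
what I have in mind -- no spider recognizes it today:

    # Standard exclusion entries, per the convention above
    User-agent: *
    Disallow: /private/

    # Hypothetical extension: documents under this tree may be
    # indexed, but index entries should only be shown to requests
    # from the listed domain.
    User-agent: *
    Restrict-To: nasa.gov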

Our search engine can hide the existence of inaccessible documents from the
user; I would assume, though I'm not certain, that others can do so as
well.  For example, you could intercept our CGI data (passed between the
Web daemon and our search daemon) and strip the security restriction from
queries coming from NASA sites.
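
As a rough sketch of that interception, here's what such a wrapper might
look like in Python.  The field name ("restriction") and the path to the
search CGI are invented for illustration -- you'd substitute whatever
your installation actually uses:

    import os
    import subprocess
    from urllib.parse import parse_qsl, urlencode

    # Hypothetical names: the real CGI path and the field that
    # carries the security restriction depend on the installation.
    SEARCH_CGI = "/usr/local/verity/search.cgi"
    RESTRICTION_FIELD = "restriction"

    def main():
        query = os.environ.get("QUERY_STRING", "")
        host = os.environ.get("REMOTE_HOST", "")

        # Requests from NASA machines get the restriction stripped,
        # so the search daemon will show them the protected documents.
        if host.endswith(".nasa.gov"):
            fields = [(name, value) for name, value in parse_qsl(query)
                      if name != RESTRICTION_FIELD]
            query = urlencode(fields)

        # Re-run the real search CGI with the (possibly edited) query
        # string; its output goes straight back to the Web daemon.
        env = dict(os.environ, QUERY_STRING=query)
        subprocess.run([SEARCH_CGI], env=env, check=False)

    main()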

I'd be interested in hearing from others who are using Harvest.

Nick
Received on Sunday, 28 May 1995 10:34:30 EDT
