Re: checklink: Disable Robots

From: David Dorward <david@dorward.me.uk>
Date: Mon, 23 May 2005 17:58:44 +0100
To: www-validator@w3.org
Message-ID: <20050523165844.GC26805@us-lot.org>

On Mon, May 23, 2005 at 11:54:30AM -0500, Tim Burkhart wrote:

> I was wanting to disable robots because we have a huge website and are 
> wanting to run a link checker throughout the entire site to verify that 
> all of the links work. However, we want to ensure that no other robots 
> are allowed to access some of our files, so we still keep all of our 
> files and folders in robots.txt.

So what is wrong with my previous suggestion of simply explicitly
allowing access to the entire site for Link Checker in your
robots.txt?
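
For reference, a sketch of what that might look like (assuming the
Link Checker still identifies itself with the user-agent token
"W3C-checklink"):

```
# Grant the W3C Link Checker full access; an empty Disallow
# value permits everything for the matching user-agent.
User-agent: W3C-checklink
Disallow:

# All other robots remain blocked from the whole site.
User-agent: *
Disallow: /
```

Well-behaved robots pick the most specific matching User-agent group,
so the blanket "Disallow: /" would not apply to the checker.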

-- 
David Dorward                                      http://dorward.me.uk
Received on Monday, 23 May 2005 16:58:51 UTC
