- From: David Dorward <david@dorward.me.uk>
- Date: Mon, 23 May 2005 17:58:44 +0100
- To: www-validator@w3.org
On Mon, May 23, 2005 at 11:54:30AM -0500, Tim Burkhart wrote:

> I was wanting to disable robots because we have a huge website and are
> wanting to run a link checker throughout the entire site to verify that
> all of the links work. However, we want to ensure that no other robots
> are allowed to access some of our files, so we still keep all of our
> files and folders in robots.txt.

So what is wrong with my previous suggestion of simply explicitly
allowing access to the entire site for Link Checker in your robots.txt?

-- 
David Dorward
http://dorward.me.uk
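The suggestion above could be sketched as a robots.txt fragment. This is only an illustration, assuming the W3C Link Checker identifies itself with the user-agent token `W3C-checklink` and that the blocked paths below stand in for your own protected directories (an empty `Disallow:` value means "allow everything" per the robots exclusion convention):

```
# Hypothetical sketch -- substitute your own paths.

# Let the W3C Link Checker crawl the entire site.
User-agent: W3C-checklink
Disallow:

# Keep all other robots out of the protected areas.
User-agent: *
Disallow: /private/
Disallow: /internal/
```

Robots pick the most specific matching `User-agent` group, so the checker would use the first block while everything else falls through to the wildcard rules.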
Received on Monday, 23 May 2005 16:58:51 UTC