- From: John Papandriopoulos <jpap@cs.rmit.edu.au>
- Date: Sat, 28 Apr 2001 22:25:11 -0400 (EDT)
- To: www-validator@w3.org
Hi,

I couldn't see a method to crawl over an entire site and validate each of the pages on that site, instead of validating each individual page by hand. This would mean the validator would need to recursively visit each link on a site. A few things to consider would be (a rough sketch follows below):

  o A limit to the depth to which it visits links
  o Do not visit directories that are above the first document targeted
  o Do not visit links that are on different servers

At the end of the validation, a summary would be presented for each document visited. This would be _very_ helpful!

ta,
John.

--
John Papandriopoulos                  http://jpap.cjb.net/
5th Year DD in Communications Eng/Computer Science
jpap@cs.rmit.edu.au
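P.S. A rough Python sketch of the kind of crawl I have in mind, with all three constraints above. The validate() function is just a placeholder for a call into the validator, and the starting URL is made up:

#!/usr/bin/env python3
"""Depth-limited, same-site crawler sketch for batch validation."""

from html.parser import HTMLParser
from urllib.parse import urldefrag, urljoin, urlparse
from urllib.request import urlopen


class LinkParser(HTMLParser):
    """Collect href targets from <a> tags."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)


def validate(url):
    """Placeholder: submit url to the validator, return a summary."""
    return "visited"


def crawl(start_url, max_depth=3):
    start = urlparse(start_url)
    # Only visit paths at or below the directory of the first
    # document targeted.
    base_path = start.path.rsplit("/", 1)[0] + "/"
    seen, results = set(), {}
    queue = [(start_url, 0)]
    while queue:
        url, depth = queue.pop(0)
        url = urldefrag(url)[0]
        if url in seen or depth > max_depth:    # depth limit
            continue
        seen.add(url)
        parsed = urlparse(url)
        if parsed.netloc != start.netloc:       # different server
            continue
        if not parsed.path.startswith(base_path):  # above start dir
            continue
        try:
            with urlopen(url) as resp:
                html = resp.read().decode("utf-8", errors="replace")
        except OSError:
            results[url] = "unreachable"
            continue
        results[url] = validate(url)
        parser = LinkParser()
        parser.feed(html)
        for link in parser.links:
            queue.append((urljoin(url, link), depth + 1))
    return results


if __name__ == "__main__":
    summary = crawl("http://example.org/docs/index.html")
    for page, result in summary.items():
        print(page, "-", result)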
Received on Sunday, 29 April 2001 05:47:30 UTC