Re: validation service

Bjorn wrote:

> I am a spare time webmaster, of http://vintagepc.viralnet.org
[...]
> 50-60 pages

That's AFAIK still within the limits of the WDG validator
(it spiders up to 60 pages):

http://www.htmlhelp.com/cgi-bin/validate.cgi?url=http%3A%2F%2Fvintagepc.viralnet.org&warnings=yes&spider=yes&hidevalid=yes

If this long line makes no sense to you, here are the pieces:

http://www.htmlhelp.com/cgi-bin/validate.cgi?
url=http%3A%2F%2Fvintagepc.viralnet.org&
warnings=yes&spider=yes&hidevalid=yes
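
If you want to build such a URL for another site, the only
tricky part is percent-encoding the url parameter. Here is a
minimal Python sketch; the parameter names are simply the ones
visible in the query string above, and the meanings in the
comments are my reading of them, not checked against any
documentation:

from urllib.parse import urlencode

def wdg_validator_url(site):
    # Sketch only: parameter names copied from the query string above.
    params = {
        'url': site,          # urlencode() does the percent-encoding
        'warnings': 'yes',    # report warnings, not only errors
        'spider': 'yes',      # crawl the whole site
        'hidevalid': 'yes',   # list only pages with problems
    }
    return ('http://www.htmlhelp.com/cgi-bin/validate.cgi?'
            + urlencode(params))

print(wdg_validator_url('http://vintagepc.viralnet.org'))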

You can tune which pages it should ignore in your robots.txt;
the name of the spider is WDG_SiteValidator. For details see
<http://www.htmlhelp.com/tools/validator/tips.html#robotstxt>

Working example copied from my robots.txt:

# Link check by <URL:http://www.htmlhelp.com/tools/valet/key.html>

User-agent: Link Valet Online
Disallow:

# WDG HTML validator <URL:http://uk.htmlhelp.com/tools/validator/>
# up to 60 pages are checked, therefore some stuff is excluded here

User-agent: WDG_SiteValidator
Disallow: /err40
Disallow: /.
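
(Disallow matches by prefix, so /err40 also catches longer
paths like /err404.html, and /. catches everything whose name
starts with a dot.)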

> This would probably be a convenience for everyone.

So far I haven't needed it, but for the W3C checklink you could
create several pages enumerating links to all other pages and
then use something like "max_depth=1" (sorry, I've forgotten
the exact syntax; I was never happy with the online versions
of the W3C and WDG link checkers, last tested in 2004).
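
If you want to try that hub-page idea, here is a minimal Python
sketch. The page list and file name are made up, and you would
still have to look up the real name of the depth option:

import html

# Made-up list: in practice, collect the URLs from your file tree.
PAGES = [
    'http://vintagepc.viralnet.org/index.html',
    'http://vintagepc.viralnet.org/hardware.html',
    # ... one entry per page
]

# Write one hub page linking to everything, so a link checker
# limited to depth 1 still reaches the whole site.
with open('linkhub.html', 'w') as out:
    out.write('<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01//EN">\n')
    out.write('<title>Link hub</title>\n<ul>\n')
    for url in PAGES:
        out.write('<li><a href="%s">%s</a></li>\n'
                  % (html.escape(url), html.escape(url)))
    out.write('</ul>\n')

Bye, Frank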

Received on Sunday, 17 April 2005 15:27:30 UTC