Re: Combo Walker/Validator [was: Conformance ratings]

Daniel W. Connolly writes:

> In message <199602130336.TAA23804@server.livingston.com>, MegaZone writes:
> > (I'd like to find a checker I can have recurse our site to
> >check them all and just report errors.  I know my own code has some silly
> >things (When tired I sometimes close tags that don't require closing) but
> >moreso, some pages are done by someone in marketing and I find errors in
> >her HTML often enough for it to be a concern for me as Webmaster.
> >
> >The tools I've tried are one page at a time.
> 
> The HTML validation service is based on James Clark's sgml parser
                      ^^^^^^^^^^ services are :-)
> (available via www.jclark.com).
> 
> There are lots of web walkers. I'll attach one below.
> 
> 20 points to the folks that glue them together and get them to work
> for MegaZone (as an alpha tester) and eventually for everybody. Heck:
> stick a cheesy GUI on it, and from reading the trade rags, this would
> probably sell like hotcakes at $49/copy ;-)

I hacked something together at:

   http://ugweb.cs.ualberta.ca/~gerald/validate/walk.cgi

   (try "www.w3.org/hypertext/WWW/People", "home.mcom.com", or,
    for MegaZone, "www.livingston.com".)

but it doesn't do a true site-walk to get URLs; it just gets them from
Alta Vista (you type a pattern of URLs you want to look for).

And it doesn't actually retrieve and validate each URL (which would
be too expensive and/or time-consuming, I think); it just emits links
to my validation service. It might be handy for someone who maintains
a lot of documents, though: they can just save the HTML source of the
returned page and use it to validate URLs interactively.
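
In case it helps anyone glue the pieces together, the link-generating
half of the idea looks roughly like this (a Python sketch just to show
the shape; the validator address and its "url" query parameter are
made up for illustration, not the real interface of my service):

    # Sketch: turn a list of URLs into an HTML page of "validate this" links.
    # The validator endpoint and its "url" parameter are assumptions made
    # for illustration; substitute whatever the real service expects.

    from urllib.parse import quote

    VALIDATOR = "http://ugweb.cs.ualberta.ca/~gerald/validate/check.cgi"  # hypothetical

    def validation_page(urls, validator=VALIDATOR):
        """Return HTML that links each URL into the validation service."""
        items = []
        for u in urls:
            link = f"{validator}?url={quote(u, safe='')}"  # assumed parameter name
            items.append(f'<li><a href="{link}">{u}</a></li>')
        return ("<html><head><title>Validation links</title></head><body>\n"
                "<ul>\n" + "\n".join(items) + "\n</ul>\n</body></html>\n")

    if __name__ == "__main__":
        print(validation_page([
            "http://www.livingston.com/",
            "http://www.w3.org/hypertext/WWW/People/",
        ]))

Feed it a URL list from a search-engine query (or anywhere else) and
you get a page you can save and click through.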

I might do the true site-walking thing eventually...
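
Roughly, the walking part would amount to something like the sketch
below (again Python, and again just the shape of it, not code from
walk.cgi): start at one page, collect the same-host links, and keep
going with a visited set and a limit so it doesn't run away.

    # Minimal same-site walker sketch: breadth-first crawl of one host,
    # collecting the URLs it reaches.  Standard library only.

    from collections import deque
    from html.parser import HTMLParser
    from urllib.parse import urljoin, urlsplit
    from urllib.request import urlopen

    class LinkCollector(HTMLParser):
        """Accumulate href values from <a> tags."""
        def __init__(self):
            super().__init__()
            self.links = []

        def handle_starttag(self, tag, attrs):
            if tag == "a":
                for name, value in attrs:
                    if name == "href" and value:
                        self.links.append(value)

    def walk_site(start, limit=50):
        """Return up to `limit` URLs reachable from `start` on the same host."""
        host = urlsplit(start).netloc
        seen, queue, found = {start}, deque([start]), []
        while queue and len(found) < limit:
            url = queue.popleft()
            found.append(url)
            try:
                with urlopen(url) as resp:
                    if "html" not in resp.headers.get("Content-Type", ""):
                        continue
                    body = resp.read().decode("latin-1", errors="replace")
            except OSError:
                continue
            parser = LinkCollector()
            parser.feed(body)
            for href in parser.links:
                absolute = urljoin(url, href).split("#")[0]
                if urlsplit(absolute).netloc == host and absolute not in seen:
                    seen.add(absolute)
                    queue.append(absolute)
        return found

    if __name__ == "__main__":
        for u in walk_site("http://www.livingston.com/"):
            print(u)

Each URL it turns up could then be fed through the validator the same
way as above.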

Gerald
-- 
Gerald Oskoboiny  <gerald@cs.ualberta.ca>  http://ugweb.cs.ualberta.ca/~gerald/

Received on Monday, 19 February 1996 08:26:35 UTC