
RE: [www-validator]

From: Gary Alderman <galderman@intelink.gov>
Date: Fri, 29 Oct 1999 18:04:46 -0400 (EDT)
Message-ID: <36FFA7EA9FE1D211A25A00C04F68F86F0395FC@jetsam.ncsc.mil>
To: www-validator@w3.org

> -----Original Message-----
> From: molewis@us.ibm.com [mailto:molewis@us.ibm.com]
> Sent: Friday, October 29, 1999 4:42 PM
> To: www-validator@w3.org
> Subject: [www-validator] <none>
> Hello,
> I am trying to find a solution to using the validator 
> on some 100,000 files!  Is there any way that anyone 
> knows of to check more than one file at a time?  
> I would appreciate any information or advice.
> Thanks,
> Mark Lewis
> molewis@us.ibm.com
Mark -- I'm chuckling...
I doubt such a huge job is appropriate for the public 
W3C web site.  (It probably would amount to a 
"denial-of-service attack" for the rest of us.)  
But let's hear from Gerald or others actually in the W3C!! 
(It's their service after all.)

My recommendation would be to install a copy on your own
server.  If you've read the archives here, you know it's
doable, though it may not be for the faint-hearted.

You can then easily hack the perl source of "check" 
to take a file of URLs as input rather than the
one-at-a-time CGI form and produce reports on each
URL.  In fact, using the CPAN perl library stuff, you
could tie it to a "web walker" to roam your space and
find its own input.  You could then hack "check" to 
write the report files wherever you wish.  Prepare 
yourself for a staggering volume of output.
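As a rough illustration of the batch-driver idea (not the actual "check"
script -- the host name, CGI path, and "uri" parameter below are all
hypothetical, standing in for wherever your local validator copy lives),
a small script could turn a file of URLs into one validator request and
one report file name per URL:

```python
# Sketch of a batch driver for a locally installed validator.
# ASSUMPTIONS: the validator's "check" CGI is reachable at BASE and
# takes one document URL via a "uri" query parameter; both are
# illustrative, not the real installation layout.
import urllib.parse

BASE = "http://localhost/cgi-bin/check"

def check_url(doc_url):
    """Build the validator request URL for one document."""
    return BASE + "?uri=" + urllib.parse.quote(doc_url, safe="")

def report_name(doc_url):
    """Derive a flat file name for the saved report."""
    return urllib.parse.quote(doc_url, safe="") + ".html"

def batch(url_file):
    """Yield (request URL, report file name) for each line of url_file."""
    with open(url_file) as f:
        for line in f:
            doc = line.strip()
            if doc:
                yield check_url(doc), report_name(doc)

# Actually fetching each request and writing the report is left out;
# at 100,000 files you would want throttling between requests anyway.
```

A web walker would simply replace the input file, feeding discovered
URLs into the same loop.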

I've been tempted to do this for a network myself, but
I've never quite summoned enough courage.

Good luck, 
Gary Alderman
Received on Friday, 29 October 1999 18:10:30 UTC

This archive was generated by hypermail 2.3.1 : Tuesday, 1 March 2016 14:17:26 UTC