Re: Feeding the webbot with a list of addresses to check

At 21:57 9/29/98 -0400, Bob Racko wrote:
>mostly when I want to have the robot check
>multiple URIs I run it multiple times
>from a script or bat file.
>
>I suppose we can change the interpretation
>of the second and subsequent arguments to mean
>additional URIs. Right now they mean keywords to search for.

Or maybe a command-line argument whose value is the name of a file
containing a list of URIs?
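
Just to make the idea concrete, here is a rough sketch (in Python, not
libwww) of what I mean - the file format, the bare "webbot <uri>"
invocation and the exit-code convention are all just placeholders, not
anything the robot actually supports today:

# Rough sketch: run webbot once per URI read from a plain text file.
# "urilist.txt" format and the "webbot <uri>" call are placeholders only.
import subprocess, sys

def check_uris(listfile):
    failures = []
    for line in open(listfile):
        uri = line.strip()
        if not uri or uri.startswith("#"):
            continue                        # skip blank lines and comments
        status = subprocess.call(["webbot", uri])
        if status != 0:                     # assume non-zero exit = fetch problem
            failures.append(uri)
    return failures

if __name__ == "__main__":
    failed = check_uris(sys.argv[1])
    print("failed:", failed)
    sys.exit(1 if failed else 0)

A built-in option would of course do this in one process instead of one
per URI, but the input format could look much the same.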

>I personally would prefer (yet another) command line
>option along the lines of -addURI *
>which would add the named link to the list of initial documents to load.
>(The same as finding an <A href="*" > in a document except this link
>would name an alternate root or starting-point).
>
>This raises the issue of return-status. What does it mean to
>try to fetch one or more documents and they fail to fetch
>(or fail to parse, or fail to... ) ?

Using a "logic engine" may not be a bad idea - then you can say things
like: "get this document only if this and this document fail". Is there a
logic parser out there (with and, or, negate for a start) and would this be
a cool thing to do?
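
The evaluation side of such an engine could be quite small; below is a toy
sketch of what I mean (the nested and/or/not expressions and the document
names are purely invented for illustration, not a proposal for an actual
syntax):

# Toy sketch of the "logic engine" idea: decide whether to fetch a URI
# based on whether earlier fetches succeeded. The tuple syntax is made up.
def evaluate(expr, fetched_ok):
    if isinstance(expr, str):               # a bare URI: did it fetch OK?
        return fetched_ok.get(expr, False)
    op, args = expr[0], expr[1:]
    if op == "and":
        return all(evaluate(a, fetched_ok) for a in args)
    if op == "or":
        return any(evaluate(a, fetched_ok) for a in args)
    if op == "not":
        return not evaluate(args[0], fetched_ok)
    raise ValueError("unknown operator: %s" % op)

# "get mirror.html only if index.html and toc.html failed to fetch"
results = {"index.html": False, "toc.html": False}
rule = ("and", ("not", "index.html"), ("not", "toc.html"))
if evaluate(rule, results):
    print("fetch mirror.html")

The harder part is probably defining what "fails" means (fetch error,
parse error, ...), which is exactly the return-status question above.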

Thanks!

Henrik
--
Henrik Frystyk Nielsen,
World Wide Web Consortium
http://www.w3.org/People/Frystyk

Received on Wednesday, 30 September 1998 18:05:32 UTC