[Q] What is the best approach for building a robot-like program?

Hi, all.

I recently began learning libwww, and I have a question.

I want to build a program that
    1. takes a URL (of an HTML page) and a keyword as arguments.
    2. traces the HTML link tree (to depth 2 or 3).
    3. collects to local disk the HTML files that contain the keyword.
    4. converts the links into local file links.
    5. highlights the keyword.
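
To make this concrete, here is my first rough sketch of steps 1 and 3
only (fetch one page and save it if it contains the keyword), put
together from the libwww example programs.  The program name
"KeywordBot" and the output file "page.html" are just placeholders, I
assume the URL given on the command line is absolute, and steps 2, 4,
and 5 are not handled at all:

    #include <stdio.h>
    #include <string.h>
    #include "WWWLib.h"
    #include "WWWInit.h"

    int main (int argc, char ** argv)
    {
        HTRequest * request;
        HTChunk * chunk;
        const char * url     = (argc == 3) ? argv[1] : NULL;
        const char * keyword = (argc == 3) ? argv[2] : NULL;

        if (!url || !keyword) {
            fprintf(stderr, "Usage: keybot <absolute-url> <keyword>\n");
            return 1;
        }

        /* A blocking (preemptive) profile keeps this first sketch simple */
        HTProfile_newPreemptiveClient("KeywordBot", "1.0");

        request = HTRequest_new();
        chunk = HTLoadToChunk(url, request);         /* step 1: fetch page */
        if (chunk) {
            char * html = HTChunk_toCString(chunk);  /* frees the chunk */
            if (html && strstr(html, keyword)) {     /* step 3: keyword test */
                FILE * fp = fopen("page.html", "w");
                if (fp) {
                    fputs(html, fp);                 /* save to local disk */
                    fclose(fp);
                }
            }
            if (html) HT_FREE(html);
        }

        HTRequest_delete(request);
        HTProfile_delete();
        return 0;
    }

Step 2 (tracing the link tree) is where I think I need the robot code
from webbot.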

I think the quickest way to build this program is to modify
"webbot" (Robot/src/HTRobot.c).

	*Is this the right approach?*

Please tell me the quickest way to do this, and how to convert the
HTML files within the webbot source.
I do not yet understand how to use libwww's filter and stream
concepts, or the HText_XXXX callbacks.
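
For example, my current (possibly wrong) understanding of an "after"
filter is that it is a callback which libwww invokes when a request
terminates, so a robot can decide there what to do with the fetched
document.  Here is a small sketch of what I mean; the handler name
terminate_handler and the profile name "KeywordRobot" are my own
placeholders:

    #include <stdio.h>
    #include "WWWLib.h"
    #include "WWWInit.h"

    /* Termination ("after") filter: libwww calls this when the request
     * has finished, successfully or not.  I guess this is where I
     * should test the page for the keyword and queue further links. */
    PRIVATE int terminate_handler (HTRequest * request,
                                   HTResponse * response,
                                   void * param, int status)
    {
        printf("Load finished with status %d\n", status);
        HTEventList_stopLoop();            /* leave the event loop */
        return HT_OK;
    }

    int main (int argc, char ** argv)
    {
        HTRequest * request;
        const char * url = (argc == 2) ? argv[1] : "http://www.w3.org/";

        HTProfile_newRobot("KeywordRobot", "1.0");

        /* Run this filter for all requests, after all other filters */
        HTNet_addAfter(terminate_handler, NULL, NULL,
                       HT_ALL, HT_FILTER_LAST);

        request = HTRequest_new();
        if (HTLoadAbsolute(url, request) == YES)
            HTEventList_loop(request);     /* loop until stopLoop() */

        HTRequest_delete(request);
        HTProfile_delete();
        return 0;
    }

If this is the wrong model for filters, please correct me.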

I am checking "http://lists.w3.org/Archives/Public/www-lib/threads.html"
for related discussions.

Regards,

#As you know, my English is limited; please bear with me.

			    Kazunori Kato   kkato@otsl.co.jp
      http://www2b.meshnet.or.jp/~kkato (Last Update:'97/02/18)
Section-1/Software Research Department/Oki Technosystems Laboratory, Inc.
    Nagoya City, Japan  TEL +81 52 733 7271   FAX +81 52 733 9367
