
[Q] Which approach is best for building a robot-like program?

From: Kazunori Kato <kkato@fantasy.otsl.oki.co.jp>
Date: Tue, 18 Feb 1997 13:43:23 GMT
Message-Id: <199702181343.NAA01636@pegasus.fantasy.otsl.oki.co.jp>
To: www-lib@w3.org
Hi, all.

I have recently begun learning libwww, and I have a question.

I want to build a program that:
    1. takes a URL (HTML) and a keyword as arguments,
    2. traces the HTML link tree (depth = 2 or 3),
    3. collects the HTML files that contain the keyword onto the local disk,
    4. converts the links into local file links, and
    5. highlights the keyword.
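The traversal in steps 1-3 can be sketched independently of libwww. The following is only an illustrative sketch, not libwww code: the `fetch` function, the page URLs, and the `crawl` helper are all assumptions introduced here for demonstration, with the network replaced by a pluggable fetcher so the logic is easy to test.

```python
from html.parser import HTMLParser
from urllib.parse import urljoin

class LinkExtractor(HTMLParser):
    """Collect the href targets of <a> tags from an HTML page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def crawl(start_url, keyword, fetch, depth=2):
    """Depth-limited breadth-first crawl from start_url.

    `fetch` is any callable mapping a URL to its HTML text (or None).
    Returns {url: html} for every visited page containing `keyword`.
    """
    collected = {}
    seen = set()
    frontier = [(start_url, 0)]
    while frontier:
        url, d = frontier.pop(0)
        if url in seen:
            continue
        seen.add(url)
        html = fetch(url)
        if html is None:
            continue
        if keyword in html:
            collected[url] = html          # step 3: keep matching pages
        if d < depth:                      # step 2: bounded link tracing
            parser = LinkExtractor()
            parser.feed(html)
            for href in parser.links:
                frontier.append((urljoin(url, href), d + 1))
    return collected
```

In a real robot `fetch` would issue HTTP requests (in libwww, via a request object and load call); here it can be as simple as a dictionary lookup over stub pages.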

I think the quickest way to build this program is to modify
"webbot" (Robot/src/HTRobot.c).

	*Is this correct?*

Please tell me the quickest way, and how to convert the HTML files in the
webbot source.
I don't yet understand how to use libwww's filter and stream concepts,
or the HText_XXXX functions.
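Steps 4 and 5 (rewriting links to local files and highlighting the keyword) are a pure text transformation, so they can also be sketched without libwww's stream machinery. This is a hypothetical helper, not libwww code: the function name, the `url_to_file` mapping, and the naive regex/replace approach are all assumptions for illustration (a real filter would parse the HTML properly and avoid touching the keyword inside tags).

```python
import re

def localize_and_highlight(html, url_to_file, keyword):
    """Rewrite known href targets to local filenames and bold the keyword.

    `url_to_file` maps collected page URLs to the local files they were
    saved as; unknown links are left untouched.
    """
    def rewrite(match):
        target = match.group(2)
        local = url_to_file.get(target, target)  # step 4: link conversion
        return match.group(1) + local + match.group(3)

    html = re.sub(r'(href=")([^"]*)(")', rewrite, html)
    # Step 5: a crude highlight; assumes the keyword is plain body text.
    return html.replace(keyword, "<b>%s</b>" % keyword)
```

Run over each saved page, this produces a self-contained local copy of the matching subtree with the keyword emphasized.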

I have been checking "http://lists.w3.org/Archives/Public/www-lib/threads.html".


# As you know, my English is limited.

			    Kazunori Kato   kkato@otsl.co.jp
      http://www2b.meshnet.or.jp/~kkato (Last Update:'97/02/18)
Section-1/Software Research Department/Oki Technosystems Laboratory, Inc.
    Nagoya City, Japan  TEL +81 52 733 7271   FAX +81 52 733 9367
Received on Tuesday, 18 February 1997 08:38:40 UTC
