RE: Web Scripting Languages

Some responses:

1) I wrote in <2F5B1459@MSMAIL.INDY.TCE.COM>:
>One possible base for this work would be Safe-Tcl, Nathaniel Borenstein's
>and Marshall Rose's email scripting extension for John Ousterhout's Tcl/Tk. 

>Safe-Tcl uses a two-level interpreter, where the outer interpreter supports
>a carefully limited set of high-level capabilities.
It actually uses twin interpreters: an untrusted interpreter runs the 
Safe-Tcl scripts, while the other, trusted interpreter can be used to extend 
the untrusted one.  This is what I get for working from memory on a paper 
read months ago...
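
To make the split concrete, here is a rough sketch of the idea in Tcl, using 
the interp command's -safe option; the "fetch" command and its policy are my 
own illustration, not the actual Safe-Tcl API:

    # The trusted interpreter creates an untrusted slave with the
    # dangerous commands (open, exec, socket, ...) removed.
    set untrusted [interp create -safe]

    # The trusted side extends the untrusted side by aliasing in a
    # restricted command.  SafeFetch and its policy are hypothetical.
    proc SafeFetch {url} {
        if {![string match "http://*" $url]} {
            error "fetch: only http URLs are permitted"
        }
        return "contents of $url"   ;# placeholder, no real network I/O
    }
    interp alias $untrusted fetch {} SafeFetch

    # The untrusted script can call only what it has been given.
    puts [interp eval $untrusted {fetch http://example.com/}]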

2) Mike Meyer wrote in <19950306.798A628.667C@contessa.phone.net>:
>> i) Be able to walk the Web on their own (travel from machine to machine);
>
>Robots or spiders, which have already been written using library
>facilities as discussed above.
Robots and spiders run on a single host machine but operate on a set (or 
series, or graph, ...) of other machines.  The proposed scripts would be 
able to move from machine to machine.  That is, in addition to this model 
(the spider/robot model):
     Script on machine A
          Operate on machine B
          Operate on machine C
          Operate on machine D
          ...
these scripts could also use this model (the autonomous agent model):
     Script on machine B, operating on machine B
     Script on machine C, operating on machine C
     Script on machine D, operating on machine D
     ...
with either sequential or parallel execution of "Script" on B, C, D, ....  
The first model is fine when available CPU cycles are scarce relative to 
network bandwidth ("avail. CPU cycles << network bandwidth"), the second is 
appropriate when CPU cycles are plentiful relative to bandwidth ("avail. CPU 
cycles >> network bandwidth"), and it's a toss-up when the two are roughly 
balanced.  A rough sketch contrasting the two models appears at the end of 
this point.  As 
Nathaniel Borenstein wrote:
>I've argued for a procedural language (safe-tcl or something
>better, if it comes along) because I want people to be able to do the
>maximal possible number of things safely.  It isn't that I'm not sure
>what *I* want to do, it's that I am absolutely sure that nobody knows
>what *everybody* will want to do.  For that reason, my focus has been on
>providing the maximum amount of expressive power that is compatible with
>a safe language for untrusted scripts.
At the very least, the autonomous agent model provides a second method for 
performing Web operations, one that may better fit the available resources 
(CPU cycles, network bandwidth, ...).  There are also cases (network 
bandwidth constraints again, for one) where a locally executing agent might 
be granted greater permissions than a remotely executing spider.
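
For concreteness, here is the promised sketch of the two models in Tcl.  The 
spider half uses the http package as a stand-in for whatever fetching 
library a real robot would use; the agent half is entirely hypothetical (the 
port number, wire format, and the agent server itself are invented), since 
no standard mechanism for shipping scripts to another host exists:

    package require http

    # Placeholder for whatever per-document work the script performs.
    proc process {page} { puts "processed [string length $page] bytes" }

    # Spider/robot model: the script stays on machine A; every document
    # crosses the network, and all CPU work is done locally on A.
    proc spider {urls} {
        foreach url $urls {
            set tok [http::geturl $url]
            process [http::data $tok]
            http::cleanup $tok
        }
    }

    # Autonomous agent model: machine A ships the script itself to each
    # host, where it runs against local data; only results come back.
    # The agent port (9999) and the server listening there are invented.
    proc dispatch {host script} {
        set chan [socket $host 9999]
        puts $chan $script
        close $chan
    }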


3) Mike Meyer also wrote:
>> I suggest the name "Spider" for this Safe-Tcl extension.
>
>That name is already in use as an alias for web-wandering robots.
Agreed.  How about "TclWeb", then?


======================================================================
Mark Fisher                            Thomson Consumer Electronics
fisherm@indy.tce.com                   Indianapolis, IN

"Just as you should not underestimate the bandwidth of a station wagon
traveling 65 mph filled with 8mm tapes, you should not overestimate
the bandwidth of FTP by mail."

Received on Wednesday, 15 March 1995 08:05:47 UTC