- From: John Middlemas <john@eco.powernet.co.uk>
- Date: Tue, 21 May 1996 04:10:24 +0100
- To: www-html@w3.org
This is a proposal to add certain extensions to HTML so that Web pages themselves become the central units of a distributed (and local) programming language. In other words, Web pages would act like the subroutines, procedures, or functions of other common languages, but with the important difference that there would be no compulsory return to the source/caller. This is only an outline proposal.

Suggested extensions to HTML to partly enable this:
---------------------------------------------------
  <FIRE href="target page URL" INPUTS="...", "...", "...", ...... >
  <ABORT href="target page URL">
      (the target page can be a local page on the calling machine
       or a page on another Web machine)
  <ARG cat picture>      Input 1 is to be called "cat picture"
  <ARG my message>       Input 2 is to be called "my message"
  etc.
  <IF> </IF> <ELSE> <ELSEIF>
  <FOR> <WHILE> <NEXT>
  <WAIT REL=vble ABS=vble >
  <LOAD href="page URL">
  <SAVE href="page URL">

FIRE passes the target page URL, and the inputs, to the target server, or to the calling machine/browser if the target page is local. The browser or server must now also interpret the target page and may not even pass it back to the caller. INPUTS could be anything - text, jpgs, other URLs, whole pages, etc. They could also tell the target how to operate, e.g. run in the background on the target machine, or perhaps also FIRE something back to the caller or on to other pages. (A rough sketch of a caller page and a target page is given further below.)

There is a possibility here for, say, "WebMail". A user (or program) could easily update or add to another Web site's pages automatically via the INPUTS. You would see your mail as Web pages on your server site, with pictures etc. also possible. The whole Internet system simplifies, and e-mail servers become redundant. Security could be built in.

Having started the target, the caller page can forget about it. The target may return itself to the caller, as is done at present, or it may itself FIRE off many other pages (in the background if required), so a chain reaction can easily be generated. This is similar to how the brain works.

ABORT deactivates the target, although a page can also self-deactivate, say after it has finished its task.

It may often be necessary for a target page to take input from several callers before doing a task, so it is essential that variables persist even when the page goes to hard disc. All variables should be local to their page; perhaps they could be stored on the end of the HTML.

<WAIT REL=t> means that at time t after the page was last fired it will activate on its own, for whatever purpose. <WAIT ABS=alarm> reactivates it at a specific absolute clock time and date. There are probably more variations required.

LOAD is included so that pages on hard disc that are likely to be fired can be loaded into RAM in advance. They are still inactive, though.

Huge programming tasks can be performed involving many computers in parallel. Conventional programming script such as IF, ELSE, etc. is now locked up and local to its Web page, where it can do less damage. Logic outside the Web page is handled by the way Web pages are linked together, similar to a neural network. Very short and simple use of IF, ELSE, etc. is recommended; the architecture of the Web page links should control the main logic, i.e. like a flow chart.

It would be nice to be able to view the system as a map and zoom in for editing. So each page should have links to all its callers, targets, creator page, index page, and any local home page(s), to facilitate tracing and editing. The map can be constructed from this information. Alternatively, you could hop from page to page as usual.
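To pull the above together, here is a rough sketch of a caller page and a target page. Everything in it is illustrative only: the URLs and input names are invented, the REL value is assumed to be in seconds, and the outline does not yet fix whether the <ARG> tags belong in the caller or the target - I have assumed they sit in the target so it can name the inputs it receives.

A caller page on one machine:

    <HTML>
    <BODY>
    <!-- Fire the target page with two inputs, then forget about it -->
    <FIRE href="http://www.example.org/album/update.html"
          INPUTS="cat.jpg", "please add this picture to my album">

    <!-- The target could later be deactivated with:
         <ABORT href="http://www.example.org/album/update.html">  -->
    </BODY>
    </HTML>

The target page (update.html on the other machine):

    <HTML>
    <BODY>
    <!-- Name the incoming inputs so the local script can refer to them -->
    <ARG cat picture>        <!-- Input 1 -->
    <ARG my message>         <!-- Input 2 -->

    <!-- A very short piece of local logic, as recommended above -->
    <IF> ... test "my message" ... </IF>
    <ELSE> ... </ELSE>

    <!-- Reactivate an hour after the page was last fired -->
    <WAIT REL=3600>

    <!-- Fire a confirmation back to the caller, or on to other pages -->
    <FIRE href="http://caller.example.org/confirm.html" INPUTS="done">

    <!-- Local variables stay with the page when it goes back to disc -->
    <SAVE href="update.html">
    </BODY>
    </HTML>

Note that the caller never waits for anything to come back: a reply, if there is one, is just another forward FIRE, which is what allows the chain reactions described above.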
Variables would have to be linked into various other HTML commands to enable the script to actually do anything, and many other HTML extensions would also be necessary for a complete programming language, e.g. setting a text INPUT into the HTML page proper. Why bother with overly complex and fragmented languages like C, Java, etc. when the structure of the Web itself can handle the job in a simpler and better way - not to mention the parallel and distributed aspects?

For example, in writing a database application each page could be one data item of the database, e.g. "Smith". The way the pages were linked would determine the structure of the database and the access to it. There would be a lot of very small pages using the power of the links! (A tiny sketch of this is appended below my signature.)

The above principles can be used to write an operating system, so that the same simple method is used from bottom to top. This is the idea behind Plexos (Plexus Operating System), on which I am currently working (a plexus is a network of nerves). It may well be that these principles should also be built into the manufacture of microprocessors; if so, it is sad that this has not already been done. Surely we are not utilising the power already waiting in the Web structure itself.

Hypertext works because it mimics the hyper-dimensional linking in the brain. Its success has nothing to do with "text" - after all, you can link from jpgs too. But the brain also "processes" inside the neurons, and this is not mimicked on the Web. If we think of a neuron as a Web page, then processing needs adding within the page. A rough first shot at this has been taken above. The brain has been under development for X billion years, so shouldn't we have tried to copy that ages ago instead of creating BASIC, C, FORTRAN, and now Java!!!

What do you think?

---------------------------------------
John Middlemas
john@eco.powernet.co.uk
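The database sketch promised above - a tiny example only, with all page names and links invented for illustration. Each page holds one data item, and the links alone give the structure:

    smith.html - one data item, "Smith":

        <HTML>
        <BODY>
        Smith
        <!-- the links give the database its structure and access paths -->
        <A href="index.html">index page</A>
        <A href="jones.html">next surname</A>
        <A href="smith-orders.html">orders for Smith</A>
        </BODY>
        </HTML>

    index.html - the structure of the whole database is simply the way
    pages like smith.html are linked from here.

Looking something up is then just following links, and updating an item is just FIREing its page with new INPUTS.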
Received on Monday, 20 May 1996 23:18:07 UTC