RE: Web neurons

>This is a proposal to add certain extensions to HTML to allow
>Web pages themselves to become the central units  of a 
>distributed (and local) programming language. 

I'd use 'programming language' loosely here.. what you're proposing isn't
a traditional programming language, is it? Languages have been almost
exclusively linear-text based up until now..

>In other words Web pages would become like the subroutines, procedures, or
>functions of other common languages, but with the important difference that
>there would be no compulsory return to the source/caller. This is only an
>outline proposal.

You're really creating a new language.. is HTML the best place to do this? I'm
sure you could do anything with anything.. but should we?

If we wanted to, we could make Z80 binaries the web standard. Everything
distributed on the web would have to run on a Z80 processor, which would be
emulated on each platform.

The question is, are Z80 binaries the best way to represent/distribute things?

(hoping nobody says yes here.. good)

>FIRE passes the target  page URL, and inputs, to the target server or
>calling machine/browser if the target page is local. The browser or server
>must now also interpret the target page and may not even pass it back to the
>caller.

Sounds like structured programming in HTML? Why not use Java..?

Don't tell me you actually LIKE HTML? :)
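For what it's worth, taking the quoted FIRE semantics literally, here's a rough Python sketch (every name in it is made up for illustration, this is not anything in the proposal) of pages as fire-and-forget procedures:

```python
# Toy sketch of the quoted FIRE semantics: control transfers to the target
# page along with some inputs, and there is no compulsory return to the
# caller. All URLs and function names here are hypothetical.

pages = {}   # URL -> handler; stands in for fetching/interpreting a page
log = []     # records what each "page" does, so the flow is visible

def page(url):
    """Register a function as the interpreter for a page URL."""
    def register(fn):
        pages[url] = fn
        return fn
    return register

def fire(url, inputs):
    """Transfer control to the target page; nothing is returned."""
    pages[url](inputs)

@page("local:/greet")
def greet(inputs):
    log.append("Hello, " + inputs["name"])
    fire("local:/done", {})   # chain onward to another page instead of returning

@page("local:/done")
def done(inputs):
    log.append("done")

fire("local:/greet", {"name": "Smith"})
print(log)  # → ['Hello, Smith', 'done']
```

Which, note, is just ordinary procedure dispatch with the return value thrown away.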

>LOAD is included so that pages on hard disc likely to be fired can be loaded
>into RAM in advance. They are still inactive though.

>Huge programming tasks can be performed involving many computers in parallel.

>Conventional programming script such as IF, ELSE, etc is now locked up and
>local to its Web page where it can do less damage. Logic outside the Web
>page is handled by the way Web pages are linked together, similar to a
>neural network. 

I think drawing a parallel between web links and neural networks is a bit rich. The
scenario you are painting IMHO would be more appropriate to distributed Java,
where the tools are already there (or will be), and are easy to use.

>Very short and simple use of IF, ELSE etc is recommended. The
>architecture of the Web page links should control the main logic, i.e. like
>a flow chart.

Representing navigation in IF THEN/ELSE structures is dangerous. Why not
just structure them simply as navigation, then infer the "IF THEN" from the
navigation?
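To sketch what I mean (a toy Python model, with invented page names, not anything from the proposal): keep the branch out of the page body and let the link structure itself do the choosing.

```python
# Sketch: each "page" just lists labelled outgoing links; a walker picks
# the link whose label matches the current state. The branching lives in
# the navigation, not in IF/ELSE inside the page. All names hypothetical.

links = {
    "check-age": [("adult", "welcome"), ("minor", "sorry")],
    "welcome":   [],
    "sorry":     [],
}

def classify(page, state):
    # The only "logic" a page holds: which label describes the state.
    if page == "check-age":
        return "adult" if state["age"] >= 18 else "minor"
    return None

def walk(start, state):
    path, page = [start], start
    while links[page]:
        label = classify(page, state)
        page = dict(links[page])[label]   # the link structure does the branching
        path.append(page)
    return path

print(walk("check-age", {"age": 30}))  # → ['check-age', 'welcome']
print(walk("check-age", {"age": 10}))  # → ['check-age', 'sorry']
```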

>It would be nice to be able to view the system as a map and zoom in for
>editing. So each page should have links to all its callers, targets, creator
>page, index page, and any local home page(s),  to facilitate tracing and
>editing. The map can be constructed from this information. Alternatively,
>you could hop from page to page as usual.

I think you're about to be bombarded with Hyper-G/Wave people telling you that
it does do this mapping and zooming stuff. I'll leave this to them cause they know
more.. :)

>Why bother with over complex and fragmented languages like C, java
>etc when the structure of the Web itself can handle the job
>in a simpler and better way; not to mention the parallel and 
>distributed aspects.

You don't have to program in Java to use it. Possibly one application would be to
build structures as you describe, but there will always be a need for a scripting
language, if only to do specialised atomic tasks (e.g. encryption).

I think a Java application would handle the distributed and parallel aspects of your
ideas much more nicely. That's not to say that the actual CONTENT needs to be written
in Java - possibly a meta-format that Java would then render.

>For example in writing a database application each page could be one data
>item of the database e.g "Smith". The  way the pages were linked would
>determine the structure of the database and access to it.. There would be a
>lot of very small pages using the power of the links!

I think URLs are too expensive for that :)
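Joking aside, the idea itself is easy to model. A toy in-memory Python sketch (hypothetical paths, no real URLs) of "each data item is a tiny page and the database structure lives in the links":

```python
# Sketch of the quoted idea: every data item is its own tiny "page", and
# the structure of the database is nothing but the links between pages.
# Purely an in-memory toy; the paths are invented for illustration.

items = {
    "/db/people/smith": {"value": "Smith"},
    "/db/people/jones": {"value": "Jones"},
    "/db/people":       {"links": ["/db/people/smith", "/db/people/jones"]},
}

def values(index_url):
    """Access path = follow links from an index page to its item pages."""
    return [items[u]["value"] for u in items[index_url]["links"]]

print(values("/db/people"))  # → ['Smith', 'Jones']
```

Which also shows the cost: one lookup per item, where a real database would do one lookup per query.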

Seriously, web 'pages' were designed to be just that - pages of text/content. Linear
and all that stuff. If you want to hack them to do more sophisticated stuff (like animation,
tables etc.) then you can. If you want to completely change your philosophy and say
that URLs (URIs.. whatever) are now 'objects', you're really stretching it. Goodbye
backward compatibility.

>Surely we are not utilising the power already waiting in the Web structure
>itself.

IMHO the web does not have any structure.. that's the problem. It only has 'links'
(whatever they are) between places. Who knows where they go, who knows why
they are there.

> Hypertext works because it mimics the hyper-dimensional linking in
>the brain. Its success is not to do with "text". After all you can link from
>jpg's too. But the brain also "processes" inside the neurons and this is not
>mimicked on the Web. If we think of a neuron as a Web page then processing
>needs adding within the page. A rough first shot at this has been taken
>above.

Why not use a neural net model to represent content/the web? Then you could
build all these structures.
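For concreteness, here's the kind of thing I mean by a neural net model of linked pages. A toy Python sketch (page names, weights and the threshold are all invented): each page sums weighted activation arriving over its incoming links and "processes" inside the node.

```python
# Toy model of pages as neurons: links carry weights, and a page "fires"
# when the weighted activation from its active callers crosses a
# threshold. The graph and numbers here are entirely hypothetical.

weights = {            # (source page, target page) -> link weight
    ("a", "c"): 0.6,
    ("b", "c"): 0.5,
}

def activation(page, inputs):
    """Sum weighted input over incoming links; fire if over threshold."""
    total = sum(w for (src, dst), w in weights.items()
                if dst == page and inputs.get(src, 0) > 0)
    return 1 if total >= 1.0 else 0

print(activation("c", {"a": 1, "b": 1}))  # → 1 (0.6 + 0.5 crosses threshold)
print(activation("c", {"a": 1}))          # → 0 (0.6 alone does not)
```

The point being: this is trivial to build in any language with a dictionary, and HTML gives you none of it for free.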

Surely HTML is not a good environment to build nets. The only reason you'd use
HTML is to be 'in' with the current web crowd. If you put nets up there in HTML,
there's no point - cause nobody will be able to render them :).

If you're going to use a new renderer, then why don't you use something other
than HTML?

>The brain has been under development for X billion years

And it's still not perfect.. I'm waiting to download the release version :)

>so shouldn't
>we have tried to copy that ages ago instead of creating BASIC, C, FORTRAN,
>and now JAVA!!! 

.. and HTML :)

Hope I didn't offend too much. I only meant to offend a little. :)

Later..

Received on Tuesday, 21 May 1996 00:00:40 UTC