RE: Web neurons

Fil,

Thank you very much for your reply - it was the first one and hopefully not
the last.

>>This is a proposal to add certain extensions to HTML to allow
>>Web pages themselves to become the central units  of a 
>>distributed (and local) programming language. 

>I'd use 'programming language' loosely here.. what you're proposing isn't
>a traditional programming language, yes? Languages have been almost
>exclusively linear-text based  up until now..

Yes, I agree, the only "language" part left would be the (hopefully) small
amount of code within the Web pages needed to process the inputs. I think a
mistake or two was made way back in assuming that linear text was the way to
go. This could be due to our historic use of paper as a medium. Hypertext
has shown that linearity doesn't pay. Why can't we now apply the same
principle to computer languages?

>>In other words Web pages would become like the subroutines, procedures, or
>>functions of other common languages, but with the important difference that
>>there would be no compulsory return to the source/caller. This is only an
>>outline proposal.

>You're really creating a new language.. is HTML the best place to do this? 

I don't really want to use HTML but it is there and probably will be for a
long while.  Also HTML is based on hyper-links which is an essential
ingredient. Don't tell anyone but I'm really using HTML to get the general
idea across. This aside I still think it could work. In fact I am more
interested in implementing the idea as an operating system in assembly language.

>I'm sure you could do anything with anything.. but should we?

This is difficult to answer. I guess common sense could be the best guide.

>If we wanted to, we could make Z80 binaries the web standard. Everything
>distributed on the web had to run on a Z80 processor, which was emulated
>on each platform.

>The question is, are Z80 binaries the best way to represent/distribute things?

>(hoping nobody says yes here.. good)

Actually, one of the ideas in AI is that you use millions of very small and
cheap processors..... but I won't upset you by pursuing this.

>>FIRE passes the target  page URL, and inputs, to the target server or
>>calling machine/browser if the target page is local. The browser or server
>>must now also interpret the target page and may not even pass it back to the
>>caller.

>Sounds like structured programming in HTML? Why not use Java..?

I don't believe in structured programming. All that is happening is that
inputs are being passed from one place to another, there is no compulsory
return as in standard languages. In fact it is more like a "GOTO" than
anything structured. The structure is in the Web itself.
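As a rough sketch of what I mean (the page registry and the fire() helper
here are just my illustration, not actual syntax from the proposal):

```python
# Hypothetical sketch contrasting a FIRE with a subroutine call.
# Page names and the fire() helper are illustrative assumptions only.

pages = {}      # registry mapping page names to their processing code
log = []        # records what each fired page received

def fire(target, inputs):
    """Hand the inputs to the target page. Nothing is returned to the
    caller, so this behaves like a GOTO rather than a subroutine call."""
    pages[target](inputs)

# A tiny "page" that just records its inputs when fired.
pages["smith.html"] = lambda inputs: log.append(("smith.html", inputs))

fire("smith.html", {"query": "age"})
```

The point is that fire() hands control onwards; any further logic lives in
the links of the target page, not in a return to the caller.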

I don't know much about Java but I don't believe it operates within a Web
page which is the important point (I haven't got Windows 95 or NT).

>Don't tell me you actually LIKE HTML? :)

Hmmm.. I don't hate it like I do C and BASIC. It's simple to produce a quick
page or two. I wouldn't call it a language because it doesn't process - yet.

>>LOAD is included so that pages on hard disc likely to be fired can be loaded
>>into RAM in advance. They are still inactive though.

>>Huge programming tasks can be performed involving many computers in parallel.

>>Conventional programming script such as IF, ELSE, etc is now locked up and
>>local to its Web page where it can do less damage. Logic outside the Web
>>page is handled by the way Web pages are linked together, similar to a
>>neural network. 

>I think drawing a parallel between web links and neural networks is a bit
>rich.

OK the Web is not a neural-network because it doesn't process like one, but
the clear parallel is in the linking, and the potential is there for Web
page processing.

>The scenario you are painting IMHO would be more appropriate to distributed
>Java, where the tools are already there (or will be), and are easy to use

Why complicate things by adding another language when all that is needed is
some modification to HTML? Integration is better than fragmentation.

>>Very short and simple use of IF, ELSE etc is recommended. The
>>architecture of the Web page links should control the main logic, i.e. like
>>a flow chart.

>Representing navigation in IF THEN/ELSE structures are dangerous. Why not
>just structure them simply as navigation, then infer the "IF THEN" from
>navigation.

Yes, I don't like IF, ELSE either. I don't understand what you mean about
"infer from navigation" - sounds interesting, please enlighten!

>>It would be nice to be able to view the system as a map and zoom in for
>>editing. So each page should have links to all its callers, targets, creator
>>page, index page, and any local home page(s),  to facilitate tracing and
>>editing. The map can be constructed from this information. Alternatively,
>>you could hop from page to page as usual.

>I think you're about to be bombarded with Hyper-G/Wave people telling you
>that it does do this mapping and zooming stuff. I'll leave this to them
>cause they know more.. :)

What is "Hyper-G/Wave"? Actually the above point of mine is not really
central, more of an editing facility, perhaps I shouldn't have mentioned it.

>>Why bother with over complex and fragmented languages like C, java
>>etc when the structure of the Web itself can handle the job
>>in a simpler and better way; not to mention the parallel and 
>>distributed aspects.

>To use Java you don't have to program it. Possibly one application would be
>to build structures as you describe, but there will always be a need for a
>scripting language, if only to do specialised atomic tasks (eg. Encryption).

Yes but they could be called and controlled by HTML extensions. I would not
call my idea an application, it is more like an operating system.

>I think a Java application would handle the distributed and parallel
>aspects of your ideas much nicer. That's not to say that the actual CONTENT
>needs to be written in Java - possibly a meta-format that Java would then
>render.

You know much more about Java than me. I really don't want to get Win 95...

>>For example in writing a database application each page could be one data
>>item of the database e.g "Smith". The  way the pages were linked would
>>determine the structure of the database and access to it.. There would be a
>>lot of very small pages using the power of the links!

>I think URL's are too expensive for that :)

I know it's a bit radical but the system can also be used on your local
machine without referring to external URL's. Aren't they planning to
dramatically increase the number of possible URL's soon - something like 10
million URL's available per square metre? Over the next decade or so, access
speeds must surely increase hugely, say with cable modems etc.

>Seriously, web 'pages' were designed to be just that - pages of
>text/content. Linear and all that stuff.

Surely Web pages are non-linear because they have hyper-links. Maybe it's
also time they evolved from plain content.

>If you want to hack it to do more sophisticated stuff (like animation,
>tables etc.) then you can. If you want to completely change your philosophy
>and say that URL's (URI's.. whatever) are now 'objects', you're really
>stretching it.

Surely all ideas should be considered on their merit alone rather than how
far they stretch from the norm. What about E=mc^2?

>Goodbye backward compatibility.

The FIRE command and inputs can be appended to a standard URL request. If
there is no FIRE command then the target page is returned by the server as
normal therefore it should be backwards compatible. If there is a FIRE
command then the server behaves very differently.
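A rough sketch of how a server might dispatch on this (the query format
with a "FIRE" flag and named inputs is just my assumption for illustration;
the proposal doesn't fix any concrete syntax):

```python
# Hypothetical sketch of backwards-compatible FIRE dispatch on a server.
# The "FIRE" flag and input names are illustrative assumptions only.
from urllib.parse import urlparse, parse_qs

def handle_request(url):
    parts = urlparse(url)
    params = parse_qs(parts.query, keep_blank_values=True)
    if "FIRE" in params:
        # New behaviour: interpret the target page with the given inputs;
        # the page may not even be passed back to the caller.
        inputs = {k: v for k, v in params.items() if k != "FIRE"}
        return ("interpret", parts.path, inputs)
    # Old behaviour: an ordinary request just returns the page as normal.
    return ("return-page", parts.path, {})

print(handle_request("http://host/page.html"))
print(handle_request("http://host/page.html?FIRE&name=Smith"))
```

An old server or browser that ignores the extra query text would simply
return the page, which is where the backwards compatibility comes from.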

>>Surely we are not utilising the power already waiting in the Web structure
>>itself.

>IMHO the web does not have any structure.. that's the problem. It only has
>'links' (whatever they are) between places. Who knows where they go, who
>knows why they are there.

Yes, information is missing; the link destination page should maybe contain
links back to all source pages. Few do, so you can get lost (you could still
get lost anyway). To construct the map I was on about (centred on a
particular page) you do need the complete information, so you can work
backwards and forwards from the page.

>> Hypertext works because it mimics the hyper-dimensional linking in
>>the brain. Its success is not to do with "text".  After all you can link from
>>jpg's too. But the brain also "processes" inside the neurons and this is not
>>mimicked on the Web. If we think of a neuron as a Web page then processing
>>needs adding within the page. A rough first shot at this has been taken
>>above.

>Why not use a neural net model to represent content/the web? Then you could
>build all these structures?

I don't know how to.

>Surely HTML is not a good environment to build nets. The only reason you'd
>use HTML is to be 'in' with the current web crowd. If you put nets up there
>in HTML, there's no point - cause nobody will be able to render them :).

I have little desire to be 'in'. As I said, HTML is there and it looks like
it could work, so why not use it? By render do you mean program? If so then I
think programming my system would be easier than BASIC since you would be
dealing with maps (like flow diagrams) rather than cryptic lines of code.
The public can understand maps and diagrams.

>>The brain has been under development for X billion years

>And it's still not perfect.. I'm waiting to download the release version :)

Are there some bugs in your current model?

>>so shouldn't
>>we have tried to copy that ages ago instead of creating BASIC, C, FORTRAN,
>>and now JAVA!!! 

>.. and HTML :)

I wouldn't call HTML a language because it isn't (yet) active.

>Hope I didn't offend too much. I only meant to offend a little. :)

Only the bit about the Web crowd, but why do you like to offend at all?

Thanks  again for the reply,
---------------------------------------
John Middlemas
john@eco.powernet.co.uk

Received on Thursday, 23 May 1996 11:58:13 UTC