W3C home > Mailing lists > Public > www-html@w3.org > April 2004

Re: complexity

From: David Woolley <david@djwhome.demon.co.uk>
Date: Thu, 22 Apr 2004 07:45:28 +0100 (BST)
Message-Id: <200404220645.i3M6jSM02842@djwhome.demon.co.uk>
To: www-html@w3.org

> O'Reilly books, MSDN, Netscape documentation, etc.

The Netscape 4 documentation is not helpful, as it conflicts with the
MSDN documentation.  One has to do something like take the intersection
of the two, then set-subtract the W3C DOMs.  The trouble is that defining
the result as that intersection makes any statement of compatibility
with it self-referential.
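As a rough sketch of that set arithmetic (the feature names below are invented purely for illustration, not taken from any real documentation):

```python
# Hypothetical feature sets, purely illustrative -- not real documentation data.
netscape4 = {"document.forms", "document.images", "window.open", "document.layers"}
msdn_ie = {"document.forms", "document.images", "window.open", "document.all"}
w3c_dom = {"document.forms", "document.images", "document.getElementById"}

# "DOM 0" approximated as what both vendors document,
# minus what the W3C DOM specifications cover.
dom0 = (netscape4 & msdn_ie) - w3c_dom
print(sorted(dom0))  # → ['window.open']
```

The circularity is that "DOM 0" has no normative definition beyond this intersection, so claiming compliance with it says nothing testable.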

One thing I found annoying about the Mozilla documentation, a couple
of years ago, was that it claimed DOM 0 compliance, but the only way
of finding out precisely what it meant by that would have been to
reverse-engineer the code.

> my point was that
> pages depend on scripting. It doesn't matter what kind of scripting.

As long as we mean pages as found on typical public internet sites,
I would agree with you that many break without scripting support
enabled (although the really successful web-based businesses tend
not to break).

I don't think it is a good thing that so many pages break without
scripting.  From the complexity point of view, it breaches a primary
tenet of the CERN concept of the web, without which we would probably
have PDF-based sites.  From the security point of view, if one
followed recommended practice, including that given by Microsoft on
several occasions, one would have scripting disabled in IE 10 to 20%
of the time, and always when one couldn't risk being hit by a new
exploit, at least when accessing external sites.
Received on Thursday, 22 April 2004 02:51:08 GMT

This archive was generated by hypermail 2.2.0+W3C-0.50 : Tuesday, 27 March 2012 18:16:00 GMT