Re: HTML 3.2 [was: Unique Names & content scope -Reply ]

Paul Prescod (papresco@itrc.uwaterloo.ca)
Tue, 7 May 1996 19:20:53 -0400


Date: Tue, 7 May 1996 19:20:53 -0400
Message-Id: <199605072320.TAA17457@itrc.uwaterloo.ca>
To: "Daniel W. Connolly" <connolly@beach.w3.org>
From: Paul Prescod <papresco@itrc.uwaterloo.ca>
Subject: Re: HTML 3.2 [was: Unique Names & content scope -Reply ] 
Cc: www-html@w3.org

At 04:33 PM 5/7/96 -0400, Daniel W. Connolly wrote:
>I'm sorry you feel this way. HTML 3.2 is something of a throwback.
>It's descriptive of current practice.

What is the point?

I sometimes use a word processor whose file format is not only not Word
compatible, but constantly changing and owned by a single company. Fine. I
lose out on compatibility. I lose out on interoperability. I lose out on
technical competence. BUT, because a particular word processor has won the
"support, marketing, PR, Wall Street" war, I get interoperability _anyways_.
RTF and the Word file format are "de facto" standards. Microsoft invented
them and everyone else reverse-engineered them. There doesn't need to be an
"industry consortium" to ensure compatibility. Whether W3C "standardizes"
Netscapeisms or not, mass-market browsers will all support them, in roughly
the same way. "The market" doesn't need W3C's help.

When I write some documents, a large and diverse audience is important to
me. I want it to be easy to write the document for this large and diverse
audience. I care about technical competence. In this case, I do need W3C's
help. I need smart people to get together and come up with a smart standard.
I can't depend on vendors because they don't care about technical
competence. I need a smart standard to be mandated as a minimum (only a
minimum!) for compliant applications.

Widespread support for <CENTER> and <FONT> was inevitable. Support for
<!DOCTYPE > will only come about if we make it mandatory.
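As a concrete illustration (this particular public identifier is the one
from the HTML 2.0 specification, RFC 1866; a document targeting another DTD
would name that DTD instead), a document that declares its document type
begins:

```html
<!-- The document type declaration names the DTD the document claims to
     conform to. Validators depend on it; mass-market browsers mostly
     ignore it. -->
<!DOCTYPE HTML PUBLIC "-//IETF//DTD HTML 2.0//EN">
<HTML>
<HEAD>
<TITLE>A validatable document</TITLE>
</HEAD>
<BODY>
<P>This document can be checked against the DTD it declares.
</BODY>
</HTML>
```

An SGML validator can then verify the document against the declared DTD,
which is exactly the kind of minimum conformance a mandated standard would
make meaningful.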

"But what if W3C becomes a sideline player?" 

If W3C and IETF become sideline players, True HTML will go "underground"
(as SGML is "underground", as UNIX is "underground", as the Mac is
"underground"), and over the years people will point to them as the Better
Way of Doing Things (as those systems are pointed to), and slowly they will
gain currency, or influence the de facto standards without becoming that
which they are supposed to replace: a mess.

"I was only following the market" sounds a lot like "I was only following
orders". If you know that a compromise is Bad For The Language, you should
just not do it. If the language is worth anything at all, people will come
around eventually. If it isn't, to hell with it. Let the "market" and "de
facto" standards rule.

SGML suffers from that "placate the vendors and ignorant users" mentality to
a certain extent, and we are paying the price every day. In retrospect the
"vendors" and "users" from the days when SGML was young were a small
fraction of the "vendors" and "users" of today. So it will be with HTML.
Just when we have finished building a massive, creaky, broken information
system around a creaky, broken language, we will have to jettison it because
it will not support the weight of the world's expectations. And if we don't
jettison it, the "market" will. The market looks after itself.

The difference between "de facto" standards and "de jure" standards should
be technical quality. The market will vote in favour of technical quality
eventually (which is why the Pentium Pro looks more like a RISC chip than
an 8086, and Windows NT works more like Unix 75 than Windows 95).

We must not destroy the "better way." (sheesh, am I really talking about
HTML???)

 Paul Prescod