Re: CSS is doomed (10 years per version ?!?)

On 6/30/05, Ian Hickson <ian@hixie.ch> wrote:
> 
> On Thu, 30 Jun 2005, Orion Adrian wrote:
> >
> > Point granted. However, until I actually see 2.1 implemented fully in
> > IE, it's a dead standard as long as IE retains its position as 80%+.
> 
> The whole point of the original statement that you took issue with (namely
> that it takes 10 years for a spec to go from concept to wide deployment) is
> that a standard is dead so long as it isn't implemented.

1) The specs take too long to reach the public. The iterative design
process doesn't really work when you've got 10 years between start and
implementation.

2) Because it takes so long, there is effectively no iterative design
process, and the language suffers. Major players whose job it is to sell
easy-to-use development software abandon the standard to promote their
own, more usable version.

I hope that's clear enough.

> > > So what, you want Web browsers to implement five dozen styling
> > > languages? Implementors will never do that. Implementors are already
> > > reluctant to implement new features, let alone new languages; if
> > > entire new languages were being released every other year, replacing
> > > the previous ones, the standardisation process would completely fail.
> >
> > Exaggeration doesn't make your case. I think they would want 4 languages
> > which are all simple over the three (HTML, CSS, JavaScript) they
> > currently have.
> 
> But by your model, all three of those languages would have been replaced
> three or four times by now (we're 16 years into the Web's life). So that's
> at least 9 to 12 languages just to get to today.

Huh? No, I'm not for replacing languages for the sake of doing so. I'm
for splitting and merging languages when it will improve the overall
experience. If layout doesn't work well from the perspective of CSS,
then move it out. If some other language's features make sense to be
included, merge them. I'm for the optimal number of languages, not for
replacement at a whim.

> > Simplicity wins in my book over complexity. If I can remove the layout
> > properties of CSS, I've greatly simplified the language and then it
> > becomes both easier to implement and easier to write for.
> 
> It's not easier to implement since, as you point out, you still have to
> implement the older version (so that existing content that uses those
> features continues to work).

The older version, however, is already implemented. You hardly have to
maintain it at all, beyond security fixes, precisely because you want it
to keep working exactly as it always has.

The new version is then free to introduce new behavior, since people
developing against the new version will see that behavior during the
development process.
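
To make that concrete, here is a minimal sketch in JavaScript (all names
hypothetical; this is not how any real browser is structured) of the idea
that the old version stays frozen while only the new one moves forward:

    // Hypothetical sketch: the legacy engine is frozen (bug-for-bug
    // compatible), and only the new engine receives fixes and features.
    var frozenLegacyEngine = {
      version: 1,
      render: function (doc) { /* behavior never changes */ }
    };
    var newEngine = {
      version: 2,
      render: function (doc) { /* new layout model lives here */ }
    };

    // Existing content that declares nothing keeps the old behavior;
    // only content that opts into the new version sees the new behavior.
    function pickEngine(declaredVersion) {
      return declaredVersion >= 2 ? newEngine : frozenLegacyEngine;
    }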

> > In your mind how many languages is the target? 1? I'd hate to see the
> > monstrosity that would be one monolithic language.
> 
> Let's see. Unicode, UTF-8, XML, XHTML, CSS, DOM, JavaScript, PNG, JPEG. So
> that would be about nine (orthogonal) specifications.

Graphics, audio, video, and other embedded documents aren't really the
issue here.
DOM is a library, not a language; people learn JavaScript and use the
DOM from it (see the snippet below).
XML, in authors' minds, is just part of learning XHTML; they never
think of it as a separate language.
UTF-8 only matters to spec writers. Most people just leave the
encoding at the default.

So as I see it you have XHTML, JavaScript, and CSS. I'm proposing one more.
 
> > > Not to mention that each language would need a whole separate test
> > > suite (who is going to write that? We have enough trouble getting test
> > > suites written for the "old and crusty" CSS specs).
> >
> > Separate test suites that are each simpler to write than the previous
> > ones. N-tier has been very popular because it abstracts each layer
> > allowing each layer to match the mental models of the people using it.
> > Each layer also provides the tools necessary to accomplish the tasks
> > that need to be done without having to worry about the whole (a very
> > cumbersome problem).
> 
> I have no idea what you are trying to say here.
> 
> Writing test suites takes years. It's been my professional career for
> several years now. There is no easy way out. Even simple specs like xml:id
> need large test suites; anything near the complexity of a rendering spec
> (e.g. one that includes the Unicode bidi algorithm) involves tens of
> thousands of tests.

I used to write test suites in previous jobs as well, and I can tell
you that it's much easier to test many simple, non-interacting things
than to test one big thing whose parts all interact. That's the whole
point behind interfaces (contract-based programming).
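
As a rough sketch (hypothetical names, nothing from any real test
suite): when a piece is specified as a small, self-contained contract,
each test exercises one input in isolation, with no document, no
rendering, and no interaction with the rest of the system.

    // Hypothetical contract: parseLength(text) returns {value, unit} or null.
    function parseLength(text) {
      var match = /^(-?\d+(?:\.\d+)?)(px|em|%)$/.exec(text);
      return match ? { value: parseFloat(match[1]), unit: match[2] } : null;
    }

    // Each test checks one input against the contract, independently of
    // everything else.
    function assertEqual(actual, expected, message) {
      if (JSON.stringify(actual) !== JSON.stringify(expected)) {
        throw new Error(message);
      }
    }

    assertEqual(parseLength("12px"), { value: 12, unit: "px" }, "pixel lengths");
    assertEqual(parseLength("1.5em"), { value: 1.5, unit: "em" }, "em lengths");
    assertEqual(parseLength("wide"), null, "rejects non-lengths");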

> > > And of course if you keep replacing languages, you're very quickly
> > > going to lose the interest of Web authors, who, by and large, have
> > > enough trouble learning one language without having to start over
> > > every other year. They would just stick with what they knew and ignore
> > > the new languages.
> >
> > The consistency argument only wins when you have a winning solution
> > already, but we don't.
> 
> I didn't say anything about consistency. I said authors don't like
> learning new languages.

There is an implicit consistency argument made any time you are
talking about change. Consistency usually wins, but only when the
current product is good.

> > CSS layout isn't where it needs to be. It was a failed experiment and
> > saying, let's not rework the problem because authors might have to
> > adjust to not being stabbed in the eye (my own personal exaggeration),
> > is a flawed argument. Yes users will have to learn where the new
> > knobbies are, but they'll like the end result.
> 
> Maybe the first time (HTML2 to <font>) or the second (<font> to CSS) but
> the third or fourth time, they will give up. (XSLFO?)

People have successfully migrated from version to version of software
over the years and done alright. They adjust to changing procedures in
their jobs. It's actually quite healthy and prevents boredom in the
workplace. I'm not suggesting that we change things around just for the
sake of it, but I also recognize it's not a terrible thing.

> > > > but honestly breaking existing pages is the group's own fault.
> > > > Versioning was invented for a reason.
> > >
> > > Versioning is a cop-out solution that doesn't actually solve anything
> > > and introduces a raft of new problems, for example lack of backwards-
> > > compatibility with existing UAs, and the requirement that every user
> > > agent implement every past version of the spec as well as the current
> > > one, massively multiplying the cost of implementation (requiring
> > > exponentially more test suites, exponentially more testing time, and
> > > resulting in exponentially more bugs).
> >
> > Technically it's geometrically, but hey, "exponentially" has a better ring,
> > doesn't it? Whether versioning, the cop-out (sounds like a movie title,
> > doesn't it?), could be argued as necessary or not, the system CSS has been
> > using hasn't been faring much better.
> 
> Actually, it's been faring MUCH better. There is no way browser
> manufacturers could possibly keep up with the work required of the system
> you advocate. We're swamped as is, just implementing and fixing one
> version of each technology. We don't have the resources to be implementing
> and fixing three versions of each.

Well, the idea is to let go eventually and leave old versions alone.
When bugs have existed in a product for a long time, users of that
product learn to depend on them, and fixes then actually break their
pages/products. Once you release a version and people depend on it, let
it be. Think of it as the "link rot" rule of development: good
libraries don't change.

Orion Adrian

Received on Friday, 1 July 2005 12:53:54 UTC