Re: FW: IE4.0 and W3C Standards [long, probably pointless]

On Mar 5,  2:03pm, Benjamin Franz wrote:

> On Wed, 5 Mar 1997, Chris Lilley wrote:

> > That may be the public perception but the reality is that W3C
> > specifications represent an interoperable consensus among our members.
> > We have 160, by the way, and you only named two. By no means are all
> > suggestions rubber stamped. Some experiments fail, and sometimes W3C
> > members have to leave these behind and move to new ways on which there
> > is consensus.
>
> You implicitly admitted that at least some proposals *are* rubber stamped,

Sorry for my imprecise wording. By no means is it the case that "all proposals
are rubber stamped".

> although this may not be what you meant to say. How many major proposals
> from MS or NS have arrived with the 'we are already implementing this in
> our browser' seal of approval? How many of those were ultimately rejected
> by the W3C? I don't want to know what they are: just the corpse count.

Heh. You know I can't say, and that anyway "more than 10 and less than 100"
would be horribly imprecise even if I were to say it.

> Less so than MS saying 'We are following the lead of the W3C on a
> proposal which we brought to the W3C and are pushing.' It is
> semantically equivalent to saying 'we are following the lead
> set by ourselves'.

I see your point, but to get accepted for work within W3C it is not
enough to propose - getting agreement is also required. So it comes
down to 'we are following the lead set by ourselves, which the others
have agreed to pursue'.

Proposals which are in the process of being implemented carry a potential
reward at the cost of some risk. The reward is obviously implementation
experience, early time to market, and the PR value of being 'innovative'. The
risk is the need to modify the implementation to track how the W3C Draft
develops (if it gets accepted) and the risk that the draft might not
make it to a Recommendation. There is some benefit to W3C in taking
proposals which are demonstrably implementable, of course, as opposed
to things which sound great but are not actually implementable.

> > A definition which
> > depends on backwards engineering is fragile; plus backwards engineering
> > is hard if you want interoperability and are aiming towards 100% bugwards
> > compatibility with some proprietary feature. The first 75% is often
> > easy ;-)
>
> Versus HTML-3.2, CSS1, Client Side Image Maps, or OBJECT, *as actually
> implemented*? ;-)

I accept that HTML 3.2 can be seen as reverse engineering in that it was
a first stake in the ground for W3C - a definition of existing practice,
the point we start from. On the other hand, classical reverse engineering
does not in general have access to those who engineered the original,
whereas the W3C did.

CSS1 has been in development for several years - the work started at
CERN - so no, that wasn't reverse engineering in any sense. Obviously
the companies and individuals who implemented a moving target took
some risk. Since November the CSS1 spec has been a Recommendation, and
it includes a definition of conformance, so now we can expect
implementations to align themselves to that definition.
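
To give a flavour of what that means in practice, here is a purely
illustrative fragment (my own example, not taken from the spec) that a
conforming user agent should parse and render as described:

  <HTML>
  <HEAD>
  <TITLE>CSS1 example</TITLE>
  <STYLE TYPE="text/css">
  <!--
    /* headings in navy, in a sans-serif face if one is available */
    H1     { color: navy; font-family: Helvetica, sans-serif }
    /* notes indented by 2em and italicised */
    P.note { margin-left: 2em; font-style: italic }
  -->
  </STYLE>
  </HEAD>
  <BODY>
  <H1>A styled heading</H1>
  <P CLASS=note>A note, indented and italicised by the stylesheet.</P>
  </BODY>
  </HTML>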

Client-side imagemaps were a proposal from one of our other members ;-)
and I accept that some clients do not implement the full thing. Send
them bug reports.

OBJECT is at this stage just a draft; it has not yet been included in an HTML
revision. When that happens (look out for announcements) we can again
expect to see early implementations migrate to fuller compliance.

> Substantial sections of
> 3.2 only work in NS or only work in MSIE or don't work in *either* as
> spec'd


If you have specific test cases, I encourage you to send them in to
browsercaps if you haven't already done so. I would also appreciate
a copy of your test cases. And of course, send the bug reports in to the
respective companies.

> Ditto for CSS1,

There are not, to my knowledge, any CSS1-compliant browser implementations
yet, although I have seen some pre-release software that comes close.
This is just a matter of time, on the order of months rather than years.

Both Microsoft and Netscape have products that implement substantial
parts of the CSS1 spec (and also of the positioning draft). They are
to be congratulated on this, and I look forward to seeing fuller
implementation and tighter conformance as these products develop.
Other browsers also implement parts of CSS and (seemingly unlike
yourself) I do consider these implementations important.

The number of content authoring tools that can generate and edit CSS
stylesheets is also increasing, which is a development I am naturally
pleased with.

> CSIM
> To tar NS in this mess as well - they have rendered CSIMs of only
> marginal use to me because of a combination of bugs. Essentially - forget
> external references to CSIMs. NS won't do them and disables even the
> *SERVER* side imagemap if you try.

I believe that the Spyglass browser implements this fully.
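
For reference, the external-reference case in question looks roughly like
this (illustrative markup only - the filenames are invented), with a
server-side map as the fallback for clients that ignore the USEMAP:

  <!-- in page.html: the MAP lives in a separate document -->
  <A HREF="/cgi-bin/imagemap/nav">
  <IMG SRC="nav.gif" USEMAP="maps.html#nav" ISMAP ALT="Navigation">
  </A>

  <!-- in maps.html: the shared client-side map -->
  <MAP NAME="nav">
  <AREA SHAPE="rect"   COORDS="0,0,99,49"  HREF="home.html"   ALT="Home">
  <AREA SHAPE="circle" COORDS="150,25,20"  HREF="search.html" ALT="Search">
  <AREA SHAPE="poly"   COORDS="200,0,250,49,200,49" HREF="map.html" ALT="Map">
  </MAP>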

> and OBJECT.
>
> OBJECT is particularly heinous in MS's mis-implementation - they managed
> to destroy the usability of OBJECT *COMPLETELY* for anything but ActiveX
> objects. If I were to try to use it for anything except ActiveX - I
> would hang 25% of the browsers visiting my site (not to mention the
> 'insecure activeX object notices'). You can't even embed a JPEG using it
> (I tried).

So, you mean that folks visiting, say,

  http://www.w3.org/pub/WWW/Graphics/PNG/Inline-object.html

will get browser crashes? Interesting, but don't tell me - tell the browser
makers.
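
For anyone curious, that page exercises markup along the following lines
(a sketch in the style of the OBJECT draft, not a copy of the actual page;
the filenames are invented):

  <!-- try the PNG; fall back to a GIF; fall back to text -->
  <OBJECT DATA="diagram.png" TYPE="image/png">
  <OBJECT DATA="diagram.gif" TYPE="image/gif">
  A textual description of the diagram, for browsers that
  render neither image format.
  </OBJECT>
  </OBJECT>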

> Any web author can tell you that the published specs of the W3C are useful
> primarily for running syntax validation on your documents. It is nearly
> useless in terms of knowing what will actually be understood by a browser

I would not be quite so pessimistic. But wait, first you slam W3C for
"rubber stamping whatever MS and NS do", then you slam W3C specs for not
being descriptions of exactly what NS and MS do.

You can't have it both ways.


> *NEVER* implement solely from the spec - I always implement according to
> spec

Well, that is good to hear. Where would you be without a spec to work from?

> - but then check against every variant of NS and MS I have (Mac,
> Unix, Win, v2,3,4) and Lynx to make sure nothing weird is happening due to
> partial and broken implementations.

All well and good. If something breaks, you can then confidently point the
finger: "You don't conform to section 3.2.7 of spec X", rather than company A
saying "B does it wrong" and company B saying "A does it wrong".

Of course, I assume that by weird you mean that things don't work as the
spec says, rather than that things are not pixel-for-pixel identical.

> I hear a LOT of hype about open standards from
> both - but when push comes to shove: proprietary is what I see roll out
> the door and onto the web.
> It is not a slam on the *programmers* - it is a slam of the *corporate
> policies* involved.

A useful distinction, and probably harder to influence - but still
possible. Consumers voting with their wallets, and with good or bad PR,
do indeed affect corporate policy.

-- 
Chris Lilley, W3C                          [ http://www.w3.org/ ]
Graphics and Fonts Guy            The World Wide Web Consortium
http://www.w3.org/people/chris/              INRIA,  Projet W3C
chris@w3.org                       2004 Rt des Lucioles / BP 93
+33 (0)4 93 65 79 87       06902 Sophia Antipolis Cedex, France

Received on Thursday, 6 March 1997 12:36:27 UTC