W3C home > Mailing lists > Public > www-dom@w3.org > October to December 2005

Re: [dom3core] getAttribute

From: Ray Whitmer <ray@personallegal.net>
Date: Tue, 6 Dec 2005 03:39:01 -0700
Message-Id: <C3381B78-5CDA-448D-8DC1-6EDB5C2DA095@personallegal.net>
Cc: www-dom@w3.org
To: Brendan Eich <brendan@meer.net>

On Dec 5, 2005, at 7:47 PM, Brendan Eich wrote:

>
> Hi Ray, a quick note to talk about the case you show that doesn't  
> work as expected in Firefox:
>
> <p id='foo'></p>
> <script>
> alert(document.getElementById('foo'));
> </script>

I appreciate the attention to this.

> This source, taken as an entire text/html document, causes Gecko to  
> move the script to the implicit head and leave the p in the  
> implicit body.  That makes the document.getElementById call become  
> a forward reference, which returns null since when the relocated  
> script in the document's head is processed, there is not yet any  
> element with id 'foo'.  This is all done to emulate IE.
>
> Giving the p some child content causes the implicit body to open  
> before the p, so the script is not moved to the (still-implicit)  
> head.  Again, IE compatible.
>
> If you use an explicit body tag, all works as expected.
>
> One more (lengthy) comment: a lot of what browsers do to gain  
> market share and hope to lead the web toward better standards  
> conformance is necessarily temporizing.  You might call it a deal  
> with the devil, but without it, there is no way forward because  
> adoption is strictly limited based on lack of backward compatibility.

Answering the comment: sometimes, as we see with the getAttribute  
problem, it has nothing whatsoever to do with backward  
compatibility.  getAttribute was implemented correctly according to  
the standard in the Mozilla code base at first.  In fact, the  
standard was written the way it was specifically to cater to the  
Netscape/Mozilla code, which at the time could not handle a string  
return occasionally being required to be null.

Then, after adoption, IE unilaterally changed it, and Mozilla  
followed, with "fixes" to match IE even though the standard was in  
place.  I understand why it occurred, but the reality is different  
from supposed "backward compatibility".
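For content caught between the two behaviors, a small wrapper can paper over the difference. This is a sketch of mine, not anything proposed in the thread, and the name getAttributeCompat is hypothetical:

```javascript
// Hedged sketch: per DOM Level 3 Core, getAttribute returns "" for an
// attribute that has not been specified, while IE (and, following it,
// Mozilla) returns null instead.  Normalizing the result lets content
// behave the same under both.
function getAttributeCompat(element, name) {
  var value = element.getAttribute(name);
  // Map the null quirk back to the empty string the specification requires.
  return value === null ? "" : value;
}
```

A wrapper like this is exactly the sort of per-site workaround that correct implementations would make unnecessary.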

> Adding UI preferences, even with the right (user-oriented, "bugward- 
> compatible") default, won't help, since most web developers won't  
> set the pref the other way when developing (assuming they even use  
> your browser), and if they did, their content might still break in  
> almost all browsers.

You should explain how this would occur, using the preferences the  
way I described them in a previous message:

(from my previous message on the topic)
> Example of radio buttons:
>
> (1) * IE (the default for almost all users; let's label it as it is: IE)
> (2) * Standards (could claim 100% compliance if only the content were;
> not used unless IE-incompatible content is found)
> (3) * IE Standards Subset (non-treacherous, for use by web authors so
> they produce content that is both IE- and standards-compliant)
>
> Users use the mode (1) if IE compatible, non-standards-compliant is
> the rule and they never need anything else.
>
> Web authors use mode (3) to test, which is designed to yield content
> that always works under mode (1) while being standards compliant.
>
> Mode (2) is only needed by test suites or content that is otherwise
> broken in IE.  If there is content that needs it, you otherwise had
> no way to process it so having to flip a UI seems marginally better
> than nothing.  Not a great solution, but in several ways superior to
> the status quo.
(end of self quote)

How does this mode for web authors produce code that "might break in  
almost all browsers"?  Mode 3 was proposed as an option that produces  
code that should work in almost all browsers while also complying  
with standards.  Only if it were incorrectly implemented would it  
produce code that "might break in almost all browsers".  Please explain.

> Even if IE7 were to fix this implicit head bug, for example, and at  
> great risk of lost backward compatibility and therefore of lost  
> market share with some (probably "Enterprise" or otherwise large,  
> institutional) customers, the sunk costs in content creation spread  
> across the web and private networks will not be un-sunk, and the  
> content will not be re-authored, until IE7 and other browsers with  
> such a fix showed adoption at or above a significant threshold --  
> or until entirely new content is written for the same purpose, for  
> unrelated reasons.  While it would help if this were to happen, it  
> does not appear likely to with IE7.

(I hope I remember this correctly, as it was related to me) MIT  
developers tell of the dilemma they faced when discovering that they  
really should fix the design of their make utility, because they  
already had 8 users. [[insert laughter at the silliness of not being  
willing to break 8 users for all those who use make today]]

Only offering users the status-quo treacherous test that reinforces  
use of broken behaviors cannot possibly help.  Without offering the  
prisoners a guide out of their dilemma, all we will have is more  
dilemma, whoever is in charge.

The whole proposal for a user-selectable strictness-of-enforcement  
mode, giving web authors a less treacherous testing tool, is,  
admittedly, off the top of my head and poorly defined.

Yet in this case it is quite clear that, with knowledge of this sort  
of issue, mode 3 would hold the author to the form where there is  
common ground between IE and the standards, i.e. require a body tag,  
which would positively reinforce authors producing code that both  
follows the standard and is functional in existing browsers.
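The common ground here can also be reached from the script side: deferring the lookup until the document has loaded sidesteps the forward reference no matter where the parser relocates the script. A sketch of mine (the helper name lookupWhenLoaded is hypothetical, not from the thread):

```javascript
// Sketch: defer a getElementById lookup until the document has finished
// loading, so it cannot become a forward reference regardless of whether
// the parser moved the script into an implicit head.
function lookupWhenLoaded(win, id, callback) {
  win.onload = function () {
    callback(win.document.getElementById(id));
  };
}
```

With an explicit body tag, of course, no deferral is needed at all, which is precisely the form mode 3 would encourage.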

You have yet to offer any evidence (which I do not doubt exists, but  
should be weighed) of the problems with this sort of option or  
something that could be derived from the idea.

> Any browser with much smaller market share than IE faces even  
> harsher trade-offs.  This is a variation on the Prisoner's Dilemma  
> (it would be pure P.D. if all browser vendors were willing to  
> cooperate).  Without meaningful cooperation, instead of "defection"  
> to the status quo and backward compatibility as we all try to gain  
> market share against the dominant browser, it is not possible to  
> advance the state of standards conformance in many ways in the near  
> term.

For behavior change, you need feedback that reinforces the type of  
behavior you wish to encourage.  In this case, if even 10 percent of  
web authors initially trusted this guidance to produce content that  
is both standards- and status-quo-compliant, it would be a start, and  
you might find that many more web authors come to see it as a good  
test environment, less treacherous than the alternative.

Standards evangelism becomes much easier if you can actually show the  
way with the browser instead of just pointing to documents you are  
not even following. With this sort of option, you do not have to  
break IE compatibility.

> But with more market share for standards-friendly browsers based on  
> their merits *independent of standards conformance*, and so long as  
> they "don't break the web" (i.e., "work like IE, or Mozilla if it  
> is handled correctly by content"), then there is hope.

It is completely hopeless without some browser reinforcement of good  
behavior by web authors.

> The trick is to get enough market share, and thereby co-evolve new  
> interoperable, standard, and *actually used* content languages with  
> "sugar on top", so that over time, new content based on the new  
> standards comes to eclipse the old content that ties our hands  
> right now.  This is far from a sure thing, but it is conceivable.   
> In fact, Gecko has eliminated a few bugward-compatible quirks over  
> its history.

Back to the reality of the issues in question here:

In the case of WRONG_DOCUMENT_ERR (quoting from the recent  
communication on the Safari browser):

 >> Maciej, what particular websites are these?  The reason I ask is
 >> that I was under the impression that IE/Windows actually throws
 >> WRONG_DOCUMENT_ERR as specified; if it does, it's rather odd that
 >> sites are doing things that would cause it to throw...
 >
 >I tested, and it does appear to throw.
 >
 >I think the sites where we ran into this were ones that had separate
 >IE and Mozilla code paths, and Safari ended up in the Mozilla code
 >path and therefore had to follow the Mozilla quirk. Given this, we
 >would probably fix it in Safari if it was fixed in Mozilla. Originally
 >we had the conforming behavior on this. But there's a lot of sites
 >out there that think there are exactly two browsers in the world.

As you see, this case has nothing to do with minority market share.   
The problem existed because Mozilla had too much market share.   
People expected Mozilla, which is apparently not following the  
specification, to behave differently from IE, which appears already  
to be following the specification, and they special-cased Mozilla  
accordingly.  I do not find it credible to blame all the woes on lack  
of market share, because frequently this turns out not to be the case  
after all.

The DOM specification was created by the browser vendors.  In many  
cases, it is just a matter of following it more carefully, even  
implementing it correctly the first time.
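Following the specification here is not onerous for authors either. A hedged sketch of mine (the function name is hypothetical): DOM Level 2 Core's importNode copies a node into the target document first, so the subsequent appendChild is safe whether or not the browser enforces WRONG_DOCUMENT_ERR.

```javascript
// Sketch: insert a node from another document in a way that works both in
// browsers that throw WRONG_DOCUMENT_ERR as specified (reportedly IE) and
// in those with the Mozilla quirk that silently allows cross-document
// appends.
function appendFromForeignDocument(parent, foreignNode) {
  // importNode (DOM Level 2 Core) creates a copy owned by parent's document.
  var imported = parent.ownerDocument.importNode(foreignNode, true);
  parent.appendChild(imported);
  return imported;
}
```

Content written this way needs no separate IE and Mozilla code paths at all.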

> What is not likely is a "brand new web" built beside the old one,  
> on top of new, undertested /de jure/ standards.  Evolution by  
> incremental change, with lots of web compatibility, is clearly the  
> only viable growth path for minority-market-share browsers.

If we were creating a new set of de jure standards, they would be  
very different from what we have today.  What we have was merely an  
attempt to codify what existed and to define extensions that would be  
implemented in a standard way through consensus, consensus which  
clearly existed in Level 1 of the standards process that defined the  
issues in question.

I would not expect anything but incremental improvement.  If we can  
just keep unjustified incremental degeneration of the sort that  
caused these problems from occurring, the web changes every day and  
will improve.  But this requires more effort, as we see, and not  
just from those with current majority market share.

I understand calculated decisions to not follow the de jure  
standard.  Ignorant breaking of the standards is less excusable.  I  
have suggested approaches that would be compatible with existing  
browsers.  I only suggest more helpful leadership on standards from  
the vendors and less jumping to false conclusions about the past,  
which results in false lessons for the future.

I love the Mozilla project and hope it succeeds, but only by behaving  
better than the alternatives.  Virtue is not automatically discovered  
with market share, and claiming helplessness due to minority status  
is often not credible.  Browser vendors who really want a better web  
need to reinforce the better practices now.

Ray Whitmer
Received on Tuesday, 6 December 2005 10:39:46 GMT

This archive was generated by hypermail 2.2.0+W3C-0.50 : Friday, 22 June 2012 06:13:58 GMT