
Re: UI WD (compliant browser)

From: Tantek Çelik <tantek@cs.stanford.edu>
Date: Sun, 20 Feb 2000 23:48:11 -0800
To: "www-style@w3.org" <www-style@w3.org>
Message-id: <0FQ9009WWSBUVE@mta4.snfc21.pbi.net>
Matthew Brealey wrote:

>> It is true that unit-less values for left and top are not valid CSS, but
>> that
>> means the content/webpage is not compliant - not the browser.
>
> No. It means that the browsers are not compliant because the pages
> wouldn't exist if the browsers didn't tolerate them.

Non-compliance by indirection?  I understand the blame you are laying at the
feet of the browsers, but I'm not sure it is reasonable to call it
non-compliance. (Exception: CSS-1 section 7.1)

> The fact is that by releasing incompliant browsers, Microsoft and Netscape
> have created the problem. If Microsoft and Netscape had followed the
> parsing rules in the first place, none of these problems would exist.

Memories are short.

Especially internet memories, it appears.

The long-honored tradition of accepting liberally authored markup (what some
so affectionately call "Tag Soup") dates back to at least Mosaic.  At the time
(1994/1995?) I was distracted by other technological windmills, so someone
else will have to fill in the history.

> By releasing the incompliant product, they commit themselves to
> incompliant software forever. As a result of sloppy error handling there
> are now millions of pages, thousands of magazine articles, books, etc.,
> that tell people that top: 75 is a valid declaration.
>
> Furthermore, by allowing invalid declarations or tokens (e.g., font-size:
> "12px" or P.1), they encourage people to create invalid pages.

I think "encourage" is a bit strong.  "Enable," perhaps.
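For reference, CSS1's forward-compatible parsing rules (the section 7.1 I mentioned above) already say exactly what a compliant user agent should do with declarations like these — drop the invalid ones and keep going. A sketch, using the examples cited in this thread:

```css
P {
  top: 75;            /* invalid: a length needs a unit — a compliant UA ignores this declaration */
  top: 75px;          /* valid: this one is used */
  font-size: "12px";  /* invalid: a quoted string is not a length — ignored */
  font-size: 12px;    /* valid */
}

P.1 { color: red; }   /* invalid selector (a class name may not begin with a digit),
                         so the whole ruleset is dropped */
```

So "handling" such content compliantly does not mean honoring it; it means discarding the bad parts predictably and rendering the rest.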

> Far from
> being user-friendly, this behaviour is actually very dangerous. The number
> of users who say 'But it works in Explorer' is incredible.

And a few years ago the quote was "But it works in Netscape."

And a few years before that, "But it works in Mosaic."

Which came first?  Liberally written pages or liberally accepting user agents?

And once the liberally written pages proliferated, does anyone aiming to
provide a browser which supports that content have any choice but to accept
and attempt to do something with sloppy content?

There is not much that can be done about either the pages or the browsers out
there.  There is quite a bit that can be done about the pages and browsers
being written today.  I prefer to focus on the latter.

>> > , neither Netscape nor Microsoft will release a compliant browser.
>>
>> That statement makes the assumption that a browser cannot simultaneously
>> be
>> compliant and handle legacy uncompliant content.
>
> I would be interested to see how the example that I cited, for which the
> only compliant interpretation is to ignore it, can be handled

Compliance can only be said to apply to valid/well-formed markup.

If the markup is invalid or not well-formed, the result is undefined*, and
whatever the user agent produces cannot be said to be compliant or
non-compliant.

Tantek

*At least in HTML.  XML (and thus XHTML), on the other hand, to its credit,
does a better job of defining how to treat documents that are not well-formed.
Received on Monday, 21 February 2000 02:49:07 GMT
