Re: @media and browsers conditional statements

On Mon, Aug 4, 2008 at 3:07 AM, Francois Remy <fremycompany_pub@yahoo.fr> wrote:
> CSS-version should be the latest fully supported css version. Currently, it
> should be CSS2 for IE7, CSS2.1 for others browsers.

That's not true.  For instance, latest Gecko does not support @page,
display: run-in, white-space: pre-line, and probably other CSS2.1
features.  I don't think any UA "fully supports" CSS2 or CSS2.1 yet.

On Mon, Aug 4, 2008 at 6:40 AM, Alan Gresley <alan@css-class.com> wrote:
> Why should authors have to fix browser bugs?

They shouldn't have to, but historically they have had to.

> What are the problems with the current practice of forward compatibly that
> would require querying support for unknown or future CSS declarations?
>
> When I last tested IE8b1, Gecko 1.9, WebKit 3 and Opera 9.5, they were all
> handling most CSS OK.

The main problem is if vendors implement properties in an incomplete
or buggy fashion.  That *does* happen even today.  Someone just posted
a list of all the differences he found between the latest versions of
the major browsers in their text-overflow: ellipsis support -- all of
them using the unprefixed "ellipsis" value, not "-ms-ellipsis",
"-webkit-ellipsis", or "-o-ellipsis".  The standard wasn't clear on
what should be done in
some corner cases.  This problem isn't going to disappear unless we
have a perfect test suite *and* everyone makes sure they obey it,
which I don't think is going to happen.
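For illustration, the usage in question looks roughly like this (a
minimal sketch; the reported differences were in corner cases the spec
left unclear, not in the keyword itself):

```css
/* Sketch: truncating a single line of overflowing text with an
   ellipsis.  Every engine accepted the same unprefixed "ellipsis"
   keyword, yet rendered some edge cases differently. */
.truncate {
  white-space: nowrap;      /* keep the text on one line */
  overflow: hidden;         /* clip whatever doesn't fit */
  text-overflow: ellipsis;  /* same keyword everywhere */
}
```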

Giving authors as much information as possible is a good thing.  Of
course lousy authors are going to misuse it, but they already can: you
can conditionally load code using JavaScript (or, for IE, conditional
comments).  You can also exploit weird behavior of particular
selectors.  The idea is to make it possible to cleanly and safely do
what can already be done (and is already done) in an unpleasant and
hard-to-maintain fashion.
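To make "unpleasant and hard-to-maintain" concrete, a classic selector
hack looks something like this (a sketch; the class name and the
worked-around bug are hypothetical):

```css
/* The "* html" hack: in a conforming browser nothing is an ancestor
   of the root element, so this selector matches nothing -- but IE6
   matched it, so only IE6 applies the rule. */
* html .sidebar {
  height: 1%;  /* e.g. trigger IE's "hasLayout" to dodge a layout bug */
}
```

The fragility is the point: the hack depends on a parsing quirk rather
than on any statement about what the browser actually supports, so it
silently breaks (or misfires) as browsers change.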

Even if we are lucky enough never to see another IE6, browsers will
always have subtle bugs.  Authors should have good tools to work
around them when that happens -- hopefully rarely.

> Adding additional complexity to CSS (via sniffing) does not get around
> issues of flawed logic in CSS2.1, CSS3 or undefined behaviors. CSS has come
> along quite fine without it developing into some hybrid CSS/Scripting
> language.

The extra complexity is trivial even in an absolute sense, and even
more so when you take into account that it potentially replaces a huge
array of selector hacks and JavaScript.  (And selector hacks become
even harder to maintain as browsers become more standardized.)  It
doesn't get around issues of *old* buggy browsers, but it will help to
get around issues of *new* buggy browsers.  It's idealistic to suppose
that there will never be any.
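To sketch the contrast: a conditional support rule along the lines
being discussed in this thread might look something like the following
(purely illustrative syntax, not any agreed-upon proposal):

```css
/* Hypothetical: apply these declarations only where the UA claims
   working support for the queried declaration, instead of inferring
   it from parser quirks or scripting. */
@supports (display: run-in) {
  h3 { display: run-in; }
}
```

Unlike a selector hack, the rule states its intent directly, so it
keeps working (or harmlessly does nothing) as browsers converge on the
standard.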

Received on Monday, 4 August 2008 14:42:41 UTC