W3C home > Mailing lists > Public > www-style@w3.org > November 2007

Re: Proposal of @ua

From: Brad Kemper <brkemper@comcast.net>
Date: Mon, 26 Nov 2007 09:01:41 -0800
Message-Id: <D89FAA61-D7DC-434B-B774-94E97261FFD8@comcast.net>
Cc: www-style@w3.org
To: Stewart Brodie <stewart.brodie@antplc.com>


On Nov 26, 2007, at 3:20 AM, Stewart Brodie wrote:

>
> Brad Kemper <brkemper@comcast.net> wrote:
>
> [I've snipped this exchange considerably, as it was getting  
> extremely long -
> the previous versions are still in the list archives, of course]
>
>> In CSS, you should be much less motivated to lie. CSS is designed to
>> skip past the parts it doesn't understand, and authors don't need to
>> branch between major browsers for most of their code, but rather just
>> for a few lines here and there. The only time branching covers large
>> sections of code is when dealing with IE. If you are using the Trident
>> engine (for instance, as I really don't know anything about Fresco)
>> and you get branched because the page asked if you are using Trident,
>> then this isn't a problem. Everyone else gets the standards-based
>> code. No need to lie so far.
>
> All of our layout and rendering engines are developed in house.   
> However, I
> am more concerned with Galio, the product that I work on, rather than
> Fresco.  Fresco is supported, but largely in maintenance mode only  
> nowadays
> and does some DOM0 and only a small amount of CSS1.  Galio, on the  
> other
> hand, does do the vast majority of DOM Level 2 and CSS 2.1.

My mistake. Please substitute "Galio" wherever I said "Fresco" then.

> We'd be
> absolutely delighted if servers would send us standards compliant  
> content.
> Sadly, not all do.

My point here was that the sites which do send standards-compliant
content are not using CSS to bypass you through some hack focused
exclusively on Firefox, for instance. That is not how people write CSS.
If there are some that are not allowing you to join the party, then they
are likely doing so with server-side detection to exclude UAs they don't
know about, and they will continue to do so. If they are using IE
conditional comments to exclude you, then they are excluding all non-IE
browsers, and you would still not be negatively affected by any new @ua
rule or UA media query that you gave an honest answer to.
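For reference, the exclusionary pattern I have in mind is IE's
downlevel-hidden conditional comment, which every non-IE browser treats
as an ordinary HTML comment and skips entirely (the stylesheet name here
is just for illustration):

```html
<!--[if IE]>
  <link rel="stylesheet" type="text/css" href="ie-only.css">
<![endif]-->
```

A UA that honestly reports itself as something other than IE is already
locked out by markup like this, so honestly answering a new @ua rule or
UA media query costs it nothing extra.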

>> In other words, I should not want UA detection because getting the
>> inappropriate rule for your browser is not that bad, but I should
>> also not want UA detection because if it isn't perfect then you might
>> receive an inappropriate rule for your browser. The worst case for
>> honestly reporting your rendering engine in CSS (getting the wrong
>> rules) is no worse than having no detection at all (and getting rules
>> that don't take into account the idiosyncrasies of your software).
>> Whereas the best case is that problems a layout may have with your
>> browser can be corrected by adding a rule for your engine that does
>> not risk messing up other UAs.
>
> I agree with the best case.  I don't agree with the worst case  
> necessarily -
> it depends what the rules that get sent are.  If it's just "I don't  
> know
> about you, no CSS for you!" then that's not useful.  A complete  
> loss of
> browsing experience is not acceptable (for our customers and their end
> users), particularly on major websites.

True, that would be worse, but it is unlikely (see my first point,
above). Authors have no motivation to start writing CSS like that. If
some did (it would be stupid, but I'm sure there are some who would),
then they could already be using IE conditional comments to exclude all
non-IE UAs, so this proposal would not change that. It is far more
likely that they would say "I don't know about you, so no regular
HTML/JavaScript for you," in which case Galio would never see the @ua
rule anyway.

>> I do not deny that [browser sniffing via User-Agent] is often used
>> inappropriately, and that when it is, it can be especially  
>> infuriating.
>> Yet if there are some special issues that a site might have with  
>> Fresco
>> that cannot be dealt with via simple object detection in  
>> JavaScript, then
>> how else can the author deal with them without detecting your UA?
>
> I know that there are some websites that do have specific browser  
> detection
> for Fresco, primarily in the early days of delivering home shopping or
> Internet banking via TV.  We're talking mid-1990s here, so ancient  
> history
> as far as the WWW is concerned.  I do not know of any that have  
> specific
> detection for Galio, and they really should not require any.
>
> I think that we agree that browser sniffing is, in general, not a  
> good idea.
> I think the primary difference between us is that you have faith  
> that a
> UA-detection facility would not be abused (too much?) and I do not.

I think the primary difference is that I think UA detection in the  
CSS layer would not make things worse for you, and could make things  
better. No worse because badly written, exclusionary UA detection is  
already happening at a deeper level and this won't change that.

Honestly, I imagine it would be more likely to have a positive effect
for IE, Gecko, WebKit, and Opera, since those are the engines people
mostly write CSS hacks for these days, in order to ensure the best
rendering for the widest audience.

The way those hacks are written today is non-exclusionary. I don't
believe that would change much if authors were given less-fragile tools
to deal with special cases as the hacks become less and less effective.
That also seems to be a point where you and I believe differently.

I also think that the sort of author that would use advanced CSS3 is  
more likely to be aware of the issues and write that CSS in a way you  
would not object to, and to set a good example for future authors.

I also believe any tool can be used for good or for evil. Just  
because an implementor cannot guarantee that it would never be abused  
is not a good enough reason to withhold a tool that is needed by so  
many, so urgently.

On why we need it: If I were designing magazine ads and someone told  
me that a fourth (or even an eighth) of the people seeing it would  
see a very messed up version of my design, then I think it would be  
obvious that there was a problem that needed correcting, perhaps by  
identifying the printers whose presses operated differently, and  
sending them a modified file that dealt with that. I realize that  
programmers often don't have much of an appreciation for what  
Marketing folks do, but I thought this group might have a better  
appreciation of the importance of trying to have consistently styled  
pages.

>> Even if I as an author did decide to branch to 2 separate files,
>> using @media, it is very likely to look something like this:
>>
>> @media screen and (renderer: trident) { @import url(http://www.example.com/my_styles_ie.css) }
>> @media screen and not (renderer: trident) { @import url(http://www.example.com/my_styles.css) }
>>
>> If I learned about the feature by reading about it, then I would
>> likely have also read that this is the preferred way to use it. If I
>> learned about the feature by viewing the examples of other people
>> who had read about the preferred way to use it, then I also would
>> have a good example.
>
> Yes, I agree.  Provided you can avoid counter-examples appearing on  
> sites
> like quirksmode and alistapart, good style should be infectious.

I'm not sure what you refer to on those two sites. I don't want to  
sidetrack the conversation, especially on a point of agreement, but  
those sites do not advocate exclusionary tactics. The UA-detection  
script on quirksmode is offered only as a last resort for when  
JavaScript object detection fails to be enough (such as when a needed  
object exists but doesn't do anything).
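To be concrete about the distinction: object detection of the kind I
mean tests for the capability itself rather than parsing
navigator.userAgent. This sketch is mine, and the helper name is purely
illustrative:

```javascript
// Object detection: ask whether the capability exists, rather than
// guessing from the User-Agent string. (The helper name is mine,
// purely for illustration.)
function supportsDom2Events(target) {
  // DOM Level 2 Events defines addEventListener on every EventTarget.
  return typeof target.addEventListener === "function";
}
```

The failure mode I mentioned, an object that exists but doesn't
actually do anything, is exactly where a test like this stops being
enough, and where the quirksmode script's UA check comes in as a last
resort.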

>>> How do you determine which browsers are the "major browsers" anyway?
>>> Can you trust the server's agent log stats given that most browsers
>>> are compelled to lie about what they are?
>>
>> I trust them well enough. I use what information I have available to
>> me. I look at my Google Analytics, which I believe does its best to
>> parse the unique parts of the User-Agent identity strings. The
>> percentages are not that far off from what is reported for general
>> populations in Wikipedia, for instance.
>>
>> I have to prioritize and concentrate my efforts where I see the most
>> need, and when one of the big 3 or 4 that account for 99% of the
>> traffic comes out with a new version, then I know that more and more
>> of the thousands of page views on my company's sites will be using
>> the newer version.
>>
>> I suppose it's possible that ANT Fresco is actually 25% of our
>> traffic and that its User-Agent identity strings are just very well
>> disguised as Firefox or something, but if so, well, that's what you
>> get.
>
> It's highly unlikely that it accounts for 25% of your traffic :-)  I
> certainly wouldn't do that with Fresco, as it is not at all like  
> modern
> versions of Mozilla.  Galio's user agent is configurable so our  
> customers
> can change it to whatever they like, but by default it is very simple:
>
> Mozilla/5.0 (compatible; ANTGalio/<version number>; <OS identifier>)
>
> However, we don't go out of our way to hide the browser identity.   
> Although
> navigator.appCodeName is Mozilla, like it is in so many browsers  
> nowadays,
> navigator.appName is "ANT Galio".  Again, all of these strings can be
> configured at run-time, so I can't guarantee that people won't  
> change them.

Inasmuch as we are talking about UA strings vs. the relative accuracy
of logs or analytics, I don't think we have any substantial
disagreement on this point, so it's OK with me if you wish to clip it
from future responses.

>> I did not mean to slight your company or Fresco in any way, just
>> because it seems to represent a less significant portion of the
>> traffic to my company's site.
>
> I know you didn't and I didn't take it that way at all.  I was just  
> making a
> general point that statistical analysis of user-agent string  
> frequencies is
> vulnerable to the same sort of manipulation as things like Google page
> ranks, where the data being analysed deliberately misrepresents  
> itself in an
> attempt to obtain a more desirable server behaviour.

I agree that it introduces a margin of error. Other evidence, from the
calls to our call center, for instance, backs up the idea that IE,
Firefox, and Safari are the three most-used browsers/UAs on our site. I
look at the stats for general trends, and take the results with a grain
of salt.
Received on Monday, 26 November 2007 17:02:04 GMT

This archive was generated by hypermail 2.2.0+W3C-0.50 : Monday, 27 April 2009 13:54:56 GMT