W3C home > Mailing lists > Public > www-talk@w3.org > January to February 1996

Re: Microsoft IE -- it just gets better and better

From: David Ornstein <davido@apocalypse.org>
Date: Sat, 27 Jan 1996 11:16:35 -0800
Message-Id: <2.2.32.19960127191635.00ab2a24@objarts.com>
To: BearHeart / Bill Weinman <bearheart@bearnet.com>
Cc: kaimuse@apocalypse.org, brianp@apocalypse.org, paulk@crocker.com, mwm@contessa.phone.net (Mike Meyer), www-talk@w3.org
At 12:23 PM 1/27/96 -0600, BearHeart wrote:
>At 12:41 am 1/27/96 PST, Mike Meyer spake:
>>>    User-Agent may not be the most technologically whiz-bang thing
>>> you can think of for content-negotiation, but it works.
>>If it works, then why did MS and others feel the need to use it (*)
>>the way they did. The fact is, it DOESN'T work. The group doing the
>
>   It works for what it's good for: I can have two versions of 
>my Site: one for Netscape and one for Everyone Else. MS is trying 
>to force me to have three--and I don't want to. 

In fact, I've got half a dozen [1] (generated by tool, thank you very much).
And  I don't mind having that many.  I've thought a lot about the advantages
and disadvantages (e.g., caching proxy servers, etc.) and it's been a very
workable, practical solution *given what's available today*.

>   When the content-negotiation features of HTTP/1.1 become 
>more than a set of words in a draft, I'll gladly use them. 

Agreed.  In fact, I very much look forward to such.

>>(*) They used it - they did the thing that gets the best results for
>>THEIR users. 
>
>   <sigh> But it doesn't get the results for their users. What 
>it gets them is more confusion. When they were identifying themselves 
>as "Mozilla/1.22 (compatible)", I believed that was their purpose. 
>
>   But "Mozilla/2.0b3", without "MSIE" in any part of the string is an 
>outright lie designed to break the system. They don't support ANY of 
>the Moz 2.0 features--so what are they trying to do besides break 
>the system? 

I, too, can't think of any other possible reason (except maybe that it was a
simple act of someone who didn't really know what they were doing -- which
seems unlikely).  But anyway, you should finish your thought:  "<sigh> But
it doesn't get the results for their users"...  Indeed it doesn't.  What do
you think Microsoft's customers will say when they point their new Mac
browser at a site that does User_Agent recognition and they get garbage?
Seems like they'll be pretty unhappy.  I'd be.
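To make that failure mode concrete, here's a hypothetical sketch of the kind of naive User-Agent sniffing a 1996 server-side script might do (the function and page-set names are invented for illustration, not from any real server):

```python
def pick_page_set(user_agent):
    """Map a User-Agent header to one of the site's page sets (naively)."""
    if "MSIE" in user_agent:
        # catches the honest "Mozilla/1.22 (compatible; MSIE ...)" form
        return "msie"
    if user_agent.startswith("Mozilla/2"):
        # assumes any Mozilla/2.x really supports Netscape 2.0 features
        return "netscape2"
    if user_agent.startswith("Mozilla/"):
        return "netscape1"
    return "generic"

# A browser sending "Mozilla/2.0b3" with no "MSIE" token lands in the
# Netscape 2.0 bucket and gets markup it may not be able to render:
pick_page_set("Mozilla/1.22 (compatible; MSIE 1.5; Windows 95)")  # "msie"
pick_page_set("Mozilla/2.0b3")                                    # "netscape2"
```

The second call is exactly the confusion described above: the sniffing logic is doing what it was designed to do, and the spoofed string defeats it.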

>   Their making a configurable option for the user to put in whatever 
>string they want is a mockery of the system and of all the efforts 
>of this volunteer group. 

And others.  Right now the BrowserCaps project [2] has begun to get some
wonderful garbage.  I noticed (and removed) an entry from someone using a
browser that claimed (in User_Agent): "Billy Boy Let Me Put This Here"...
Yeah, that really helps us all.  I guess I'm going to have to do some
regular cleaning by hand of the database.  A real pain in the a**.
I presume similar things are starting to happen to, for example, the HTML
Forms Capabilities Testing site at Digital [3].
 
>   Add to that their refusal to participate in the content-negotiation
>negotiations and their intention becomes clear: To force their way 
>into the market and into the position of unilaterally setting the 
>standards. 

Yup.

>   MS is trying to force me to provide a separate set of content to 
>their browser. And they're holding all my Netscape users hostage 
>for it. 

I'm not quite sure that's true.  Netscape users should continue to get what
they want from a site that does negotiation based on User_Agent.  It's the
IE users that may get junk.  Seems to me that Microsoft is gambling that
that will be a large enough audience for us publishers that we'll have to
respond when they complain.  Sure seems like a big gamble.  As a
businessperson, I wouldn't take that one...

>>or convince thousands of webmasters to fix their
>>software.  
>
>   The implication that their software is broken is short-sighted. 

I agree.  I think that there's a strong tendency for those who are working
hard to advance the state of the art to dismiss pragmatic solutions.  I've
chaired a standards committee and I know what it's like when you understand
issues completely and others don't seem to get it.  But experience in the
business world has taught me that it's necessary to balance a strategy of
continuous improvement with actually responding to customer needs (in the
short term).

>>I've been seeing a lot of snake oil on the web lately, and I've always
>>considered content negotiation based on user agent as such.  Convince
>>me I'm wrong, and that you're successfully negotiating content based
>>on user-agent. Tell me how you treat emacs-w3? IBrowse? Charlotte?

Since you ask:  I have a set of pages [1] designed for each of four or five
major browsers, plus a page set that is as close to standard as I can get
it.  This last page set is designed the way most good designers do it now:
use as many features as possible from the standard, minimize use of vendor
extensions and ensure compatibility at the cost of features.  BUT: the other
pages go all out and take advantage of many subtle variations in the
browsers in question.  Users are given the option to pick a set of pages to
view (stored in a long-term, changeable preference), but the system makes a
strong suggestion to them based on the User_Agent they are using.  This
system treats most users very well (as most users are using the major
browsers) and everyone else fine.
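A minimal sketch of that scheme, assuming hypothetical page-set labels and a prefix-match suggestion table (neither is from the actual site):

```python
# Ordered prefix matches on the User_Agent string; first hit wins.
SUGGESTIONS = [
    ("Mozilla/2", "netscape2"),
    ("Mozilla/",  "netscape1"),
    ("Lynx/",     "text"),
]

def choose_page_set(user_agent, stored_preference=None):
    """Return the page set to serve for this request."""
    if stored_preference is not None:
        return stored_preference          # the user's long-term choice wins
    for prefix, page_set in SUGGESTIONS:
        if user_agent.startswith(prefix):
            return page_set               # a suggestion only; user may override
    return "standard"                     # the conservative, near-standard set
```

The key design point is the order of precedence: User_Agent only ever *suggests*; an explicit user preference, once stored, always overrides it.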

Some people have argued that it's impossible to keep track of what different
browsers do with the various HTML constructs.  After trying my pages on
about eight different browsers, I realized (like many) that it was hopeless
to try to test on everything.  But I didn't think it was OK to leave behind
those people using browsers I couldn't test with.  So I built BrowserCaps
[2].  Using BrowserCaps, I've collected (and make publicly available)
information about 67 browsers (as identified by User_Agent) and 64 aspects
of HTML.  With the help of the net, it is possible to achieve pretty good
results.
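A sketch of how such a database might be consulted when generating a page, with invented entries and a deliberately conservative default for browsers nobody has tested yet (the table contents and names here are hypothetical, not BrowserCaps data):

```python
# Feature flags keyed by a User_Agent prefix.
CAPS = {
    "Mozilla/2": {"tables": True,  "server_push": True},
    "Lynx/":     {"tables": False, "server_push": False},
}

def supports(user_agent, feature):
    """Look up a feature flag; assume unsupported for unknown browsers."""
    for prefix, caps in CAPS.items():
        if user_agent.startswith(prefix):
            return caps.get(feature, False)
    return False   # untested browser: fall back to the safe assumption
```

Falling back to "unsupported" is what keeps users of untested browsers from being left behind: they get the plain version instead of garbage.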

(BTW, *please* stop by BrowserCaps and vote with your browser!  It only
takes ten or fifteen minutes and you're doing everyone a big favor.  I've
just added a bunch more tests -- including server push, courtesy of BearHeart).

That said, I look forward to the work being done on "real" content negotiation.

[1] See Outbreak, my emerging diseases site at
http://www.objarts.com/outbreak-unreg
[2] BrowserCaps, a collaborative database of browser HTML support
[3] http://www.research.digital.com/nsl/formtest/home.html
-----------------------------------------------------
David Ornstein
Outbreak: http://objarts.com/outbreak-unreg
BrowserCaps: http://objarts.com/bc
Personal Info: http://objarts.com/davido
Received on Saturday, 27 January 1996 14:16:50 GMT

This archive was generated by hypermail 2.2.0+W3C-0.50 : Wednesday, 27 October 2010 18:14:19 GMT