
Re: accessibility at risk on commercial sites:

From: david poehlman <david.poehlman@handsontechnologeyes.com>
Date: Thu, 21 Oct 2004 17:09:33 -0400
Message-ID: <014101c4b7b2$4420f9d0$6401a8c0@DAVIDPC>
To: "David Woolley" <david@djwhome.demon.co.uk>, <w3c-wai-ig@w3.org>

Dave and all,

It is the secure side of things that most distresses me.  Unfortunately, not 
all user agents are easily hackable and it would distress me to have to do 
that in order to carry out my activities anyway.  I don't think there'd be 
much, if any, added cost if developers of commercial sites changed their 
behaviour in regard to the information they receive from the requestor. 
Instead of flat out refusing, they could let you in but warn you that the 
site might not behave as expected.

Johnnie Apple Seed

----- Original Message ----- 
From: "David Woolley" <david@djwhome.demon.co.uk>
To: <w3c-wai-ig@w3.org>
Sent: Thursday, October 21, 2004 4:54 PM
Subject: Re: accessibility at risk on commercial sites:

> I hope this finds its way to those who can help with it.  It seems that an
> alarmingly growing number of ECommerce establishments are denying access to

That's not new.  It used to be a standard question on the Lynx mailing
list as to what fake User Agent string was the best compromise between
being honest and not looking like anything other than IE to browser
sniffing.  In fact, the practice is about as old as IE itself, since IE
uses a bogus User Agent string format in which it claims to be Mozilla
(then, as a pure comment, reveals its real identity).  I believe this is
because of discrimination in favour of Netscape in earlier years.

I believe most other minority browsers pretend to be IE (i.e. IE
pretending to be Mozilla) out of the box.
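The kind of naive sniffing that forces this compromise can be sketched as follows; the matching rules and the faked Lynx string are illustrative assumptions, not any particular site's code:

```python
# Hypothetical sketch of naive server-side browser sniffing: a site that
# only looks for "MSIE" or a "Mozilla/" prefix admits IE (and anything
# spoofing it) while turning away an honest minority browser.

def is_welcome(user_agent: str) -> bool:
    """Return True if this naive sniffer would admit the browser."""
    return "MSIE" in user_agent or user_agent.startswith("Mozilla/")

# IE's bogus format: claims Mozilla, reveals itself in a comment.
ie = "Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1)"
# An honest Lynx string fails both tests and is rejected.
lynx = "Lynx/2.8.5rel.1 libwww-FM/2.14"
# The traditional workaround: Lynx pretending to be IE pretending to be Mozilla.
faked = "Mozilla/4.0 (compatible; MSIE 6.0; Lynx/2.8.5rel.1)"

print(is_welcome(ie))     # True
print(is_welcome(lynx))   # False
print(is_welcome(faked))  # True
```

The workaround only changes what the browser sends, not what it can render, which is why such sites may still misbehave once you are in.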

Probably the most subtle reason for this would be keyword stuffing
of search engines - if your user agent is unknown, you may be a search
engine, so you get served the stuffed keywords.  Another subtle
reason might be that you are
assumed to be an automated page fetcher - advertising paid portal
sites don't like these because large amounts of information can
be processed without a human seeing the adverts.

Generally though it is a combination of the 80-20 rule (you only need
to support 80% of the market to be commercially successful - and you
shouldn't support more as it is more profitable to create new products
instead), and a continued presentational view of the "web", which means
that predictable appearance is the most important characteristic of
a site.

In many cases, it may not actually be that IE and Mozilla are selected,
but that the browser capabilities database in the server says that they
support all the features used in the site design (e.g. Lynx will be rejected
if the code says must have frames or must have JavaScript).  This is
really the same problem in a different form, and I think it is why
Lynx has been rejected in the past.
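A capabilities-database rejection of this sort might look like the following sketch; the table entries are illustrative assumptions, not taken from any real browscap data:

```python
# Hypothetical browser-capabilities lookup: the server rejects a browser
# not by name, but because the database says it lacks a feature the site
# design demands (e.g. frames or JavaScript).

CAPABILITIES = {
    "MSIE":    {"frames": True,  "javascript": True},
    "Mozilla": {"frames": True,  "javascript": True},
    "Lynx":    {"frames": False, "javascript": False},
}

def admitted(browser: str, required: set) -> bool:
    """True if the database says the browser supports every required feature."""
    caps = CAPABILITIES.get(browser, {})
    return all(caps.get(feature, False) for feature in required)

print(admitted("Lynx", {"javascript"}))  # False: same rejection, different form
print(admitted("MSIE", {"javascript"}))  # True
```

The outcome for Lynx is the same as an explicit blacklist, even though no user agent was ever named in the site's rules.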

A special case probably applies to secure sites, in that there is,
theoretically, someone you can blame, and sue, if there are security holes
in the browser or its SSL implementation.  (At one time Lynx didn't
authenticate the site, not that that helps with many sites, as they don't
have authenticated URLs that match the business, and nobody checks URLs
or certificates anyway.)
Received on Thursday, 21 October 2004 21:08:02 UTC

This archive was generated by hypermail 2.3.1 : Tuesday, 13 October 2015 16:21:30 UTC