
Re: Accessibility problems with Blackboard?

From: Martin McCormick <martin@dc.cis.okstate.edu>
Date: Mon, 29 Apr 2002 11:51:41 -0500
Message-Id: <200204291651.g3TGpfl70186@dc.cis.okstate.edu>
To: w3c-wai-ig@w3.org
	I read the message about Blackboard along with another message on
this list regarding the BBC television site and a special server
that apparently renders the site in standard html.  I don't yet
know whether the BBC site is updated manually or whether it serves
content generated dynamically from the main site, but if it is the
latter, it underscores a point I want to make.

	The message I am responding to lists a substantial number
of technical problems which may or may not be related to
how Blackboard is locally implemented.  Solving these problems
has already taken up, and will continue to take up, a great deal
of people's time, not to mention the money spent on JAWS, etc.
It is still broken.  We are two days away from the first of May,
2002, and those students who haven't dropped the classes in
question are going through a very frustrating time, I am sure.

	The fact that they need sighted assistance to use the
course material after installing JAWS is beyond absurd.  It is
pathetic.

	I am not bashing Blackboard or any other vendor right
now, but don't get me started.  I am bashing a way of thinking
which is repeated so often that it almost sounds like the truth.

	This way of thinking says that access will happen when
the next version of JAWS, IE, JS, SGML, HTML, NN, [A-Z][a-z], etc
comes out and we run it on OS version N+1, which requires a new
motherboard and an upgraded software maintenance agreement that
gives over the first-born male child and the vendor's choice of
any other subsequent children, etc.

	Then, it comes to pass that all these things happen and
things are still broken.

	There is the idea that if we throw more and more complex
solutions at a chaotic problem, we will somehow miraculously fix
it without anybody having to plan for anything.

	By vocation and avocation, I come from an electronics
technology background.  Solving problems and fixing things has
been both what I do for fun and how I earn my living.  These
days I earn my living working with UNIX systems, making them
provide my employer with domain name service and monitoring
parts of our network to make sure it is in good health.

	The Internet is described by a layered functional model
(the 7-layer OSI reference model) that exists more in theory
than in practice, but the model is sacred to standards-based
product vendors and to those who want to build open-source
applications that talk over a network.

	The effect is that hardware and software makers know that
they might as well forget it if their device or application won't
work over the Internet.  TCP/IP is praised and cursed constantly,
and there is no end of bright folks who would rip it all out and
most likely repeat the same mistakes the original ARPANET
founders learned the hard way; but the fact is that the network
side of things works pretty well most of the time and very well
much of the time.

	It is all because of standards and a lack of chaos at the
core level.  Random things do occur, but even that is anticipated
as much as one can.

	There is discussion on whether lynx and other no-script
browsers are obsolete.  I don't think obsolete is the right word,
but they are not compatible with a large number of web sites
these days.  The more I learn about what it would take to equip
lynx with javascript, the more daunting the task seems.

	We recognize the existence of WAP for PDAs, which have
tiny screens, and much of the reason for that is that users just
want the text, thank you.

	Why do we not simply recognize that standard html, coupled
with lynx or any other html engine, is a legitimate access
solution?  It is already here and has been for years.  Servers
could deliver their bleeding-edge content to anybody who thinks
they can use it and then automatically degrade to html or WAP if
the remote host says that's all it can handle.  What is wrong
with that?  At least one could tell right away whether it was
going to work.  There would be times when the translation engine
probably couldn't deliver content, but it beats all the bad
options we have now.

	There isn't a system big enough or fast enough ever to
work under the present model, because the issues will always be
one step ahead of the solutions.  Repeat after me:
"Retrofit bad.  Planning and standards good."

Martin McCormick WB5AGZ  Stillwater, OK 
OSU Center for Computing and Information Services Network Operations Group
Received on Monday, 29 April 2002 12:53:15 GMT
