Re: [Fwd: A Call to Reorganize WCAG 2.0]

> 
 > -------- Original Message --------
 > Subject: A Call to Reorganize WCAG 2.0
 > Date: Mon, 23 Aug 2004 07:56:08 -0400
 > From: RUST Randal <RRust@COVANSYS.com>
 > To: WAI <w3c-wai-ig@w3.org>
> 
 >      WCAG should be divided into Guidelines, which can
 >      be measured and tested, and Suggested Best Practices, which
 >      can only be tested by a person.

From the wording of the above sentence I interpret "measured and
tested" as meaning "measured and tested by software". If this is
indeed the suggestion, then the main problem with it is that so little
of what is needed can, in fact, be reliably verified automatically.
Whereas an XML parser can establish whether an XHTML document is valid
according to the DTD, it cannot determine whether the elements and
attributes have been used with the semantics defined in the
specification. Heuristics can identify markup that is probably
incorrect, but the final judgment can only be made by a human
evaluator.

If satisfaction of machine testable requirements were allowed as the
basis of a conformance claim, then this would be meaningless as a
statement about the accessibility of the content. To take the example
of guideline 1.1, all that can be tested in software is, at best, the
existence of text associated with the non-text content, e.g., an ALT
attribute or content inside a DESC element in SVG. This supposed "text
equivalent" could be entirely meaningless from a user's perspective,
yet satisfy the machine testable requirement. I could make my entire
Web site conform to this requirement without making it any more
accessible than the same Web site with no text equivalents whatsoever.
A similar point can be made with respect to many of the other guidelines.
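To make the limitation concrete, here is a minimal sketch (my own
illustration, not any real checker's API) of the strongest machine
test available for guideline 1.1: verify that every IMG element
carries a non-empty ALT attribute. A meaningless equivalent passes it
just as readily as a genuine one.

```python
# Sketch: a naive machine test for "text equivalents" that checks only
# that every <img> has a non-empty alt attribute. The checker and its
# names are illustrative assumptions, not part of WCAG or any tool.
from html.parser import HTMLParser

class AltChecker(HTMLParser):
    def __init__(self):
        super().__init__()
        self.failures = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            alt = dict(attrs).get("alt")
            if not alt:  # missing or empty alt fails the machine test
                self.failures.append(attrs)

def passes_machine_test(html):
    checker = AltChecker()
    checker.feed(html)
    return not checker.failures

# A useless "equivalent" passes; only an absent one fails.
print(passes_machine_test('<img src="chart.png" alt="image123">'))  # True
print(passes_machine_test('<img src="chart.png">'))                 # False
```

The test cannot distinguish alt="image123" from a description that
actually conveys the content of the chart; that judgment remains with
a human evaluator.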
 > 
 > The Guidelines should deal strictly with W3C Technologies, so that
 > vendors can be left to ensuring the accessibility of proprietary
 > technologies such as Shockwave and PDF. Vendor technologies can then be
 > addressed in the Suggested Best Practices. Other items, such as clarity
 > of content, should also move out of Guidelines.

At present the guidelines proper are not technology-specific; indeed,
one of the principal reasons for working on WCAG 2.0 was to remove
the implicit HTML-specificity of WCAG 1.0. If the above-quoted
proposal would involve re-introducing technology-specific requirements
at the level of the guidelines, it would be a step backward from the
aims of WCAG 2.0 and would face inherent limitations whenever a new
technology emerged on the scene. Techniques for WCAG 2.0 are already
restricted to W3C technologies, and the development of techniques for
other, proprietary technologies is already the responsibility of
external parties - so if the above proposal is construed as applying
at the technique level then it simply expresses the status quo.
 > 
 > I propose this because WCAG Guidelines must be measurable and
 > quantifiable. There can be no gray areas, otherwise it makes it too
 > difficult to make a business case for accessibility. The measurable
 > Guidelines must work entirely in concert with other W3C publications,
 > such as HTML, XHTML, CSS and DOM. Moving outside of the W3C realm only
 > causes confusion, frustration and, ultimately, ignorance of
 > Accessibility Guidelines.

But "accessibility" to identified groups of users is ultimately not
amenable to automated verification, though much of it can be reliably
tested by informed human judgment. If conformance claims are based on
the outcomes of machine validation only, then the result will be empty
conformance assertions which are true in the sense that all of the
machine testable requirements have been met, but largely worthless as
measures, in any reasonable sense, of the type of accessibility
attained. People will craft their content to satisfy the automated
tests, and not to make it more accessible to users. They will regard
"accessibility" as a discrete topic to be dealt with by ex post facto
testing and retro-fitting, rather than as integral to the design of
the content. Of course some might argue that such is the case already;
all I am suggesting is that dividing the guidelines along the proposed
dichotomy would aggravate the problem by allowing content developers
to use machine testing as the sole justification for a (valid)
conformance assertion.

 > 
 > The average developer can easily grasp HTML validation and its results,
 > but cannot easily understand the results of a BOBBY test. Accessibility
 > testing always results in ambiguous results that are confusing in some
 > aspects. All too often, the final decision on accessibility is left up
 > to human judgement -- which may or may not be accurate.
 > 
I suspect the average developer can't even understand the error
messages generated by an HTML validator, and doesn't know what a
DTD or schema is. This is where the authoring tool guidelines come
into play, by prompting for the necessary information as the content
is written and performing various checks along the way.

Developers who understand the underlying technologies, and who are
either writing content directly or writing tools to create content,
constitute the most important audience for WCAG 2.0. Much more
emphasis should be placed on improving the tools with which content is
created, rather than on encouraging people who don't understand the
underlying technologies to apply guidelines which can't be properly
understood without a grasp of those technologies themselves.

Another strategy which I favour is to design technologies so that,
simply by writing content that conforms to the format specification,
authors supply most if not all of the semantics required to enable
the content to be automatically transformed into presentations
adapted to the needs of a broad variety of users.
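As a sketch of that strategy (the mini-format below is hypothetical,
my own example, not a W3C format): if the format itself is semantic -
sections, titles, emphasis - then any conforming document can be
transformed mechanically into an alternative presentation, here a
plain-text outline such as a text browser might render, with no extra
effort from the author.

```python
# Sketch: a hypothetical semantic mini-format whose conforming
# documents can be transformed into an alternative presentation
# automatically, because the semantics live in the markup itself.
import xml.etree.ElementTree as ET

DOC = """<doc>
  <section><title>Results</title>
    <para>Sales rose in <emph>every</emph> region.</para>
  </section>
</doc>"""

def to_outline(xml_text):
    """Render sections as headed plain text, flattening inline markup."""
    root = ET.fromstring(xml_text)
    lines = []
    for section in root.iter("section"):
        title = section.find("title")
        lines.append("== " + title.text + " ==")
        for para in section.iter("para"):
            lines.append("".join(para.itertext()))
    return "\n".join(lines)

print(to_outline(DOC))
```

The author wrote only conforming markup; the adapted presentation
falls out of the format's semantics, not out of any separate
accessibility retro-fitting.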

Received on Tuesday, 24 August 2004 01:11:28 UTC