Comments regarding Developers' Guide to Features of Web Accessibility Evaluation Tools - W3C First Public Working Draft 24 July 2014

Reference URL: http://www.w3.org/TR/2014/WD-WAET-20140724/





The Web Accessibility Initiative's creation of documents such as this one can
serve as an important, unbiased, referenceable resource for users and
consumers of accessibility testing tools. While marketing documentation for
accessibility testing tools may make any number of claims about their
capabilities and downplay their shortcomings, this document provides an
impartial overview of the features that matter to users of such tools. There
are, however, a handful of opportunities for improvement that should be
addressed prior to final publication. They are listed below.
General Comments

Repeated references to EARL

This document makes six references to EARL (Evaluation and Report Language),
a reporting format that has been under development since 2001. This
specification has seen no measurable demand in the marketplace and almost
no implementations in major testing tools in industry. Accessibility
testing tools that have claimed to support this specification either did
not actually do so, no longer do so, or no longer exist. A Google search
for "Evaluation and Report Language" shows, in the first 10 pages of
results, only one mention of EARL by a tool vendor: SiteValet, a product
that by all indications is abandonware. The remainder of those 10 pages of
results consists mostly of discussions of EARL by academics, W3C employees,
and various members of WAI-related working groups. There are no results
from product vendors. While EARL is admittedly a robust and comprehensive
reporting format, it is wholly irrelevant in practice. Its mention in this
document serves no purpose for actual end users and consumers. The repeated
references to EARL will only confuse consumers into believing they should
look for products that feature such a reporting format when none actually
exist.
Inclusion of outdated information

Section 1.1, Evaluation Tools, mentions that "W3C Web Accessibility
Initiative (WAI) provides a list of web accessibility evaluation tools" and
provides a link to http://www.w3.org/WAI/ER/tools/ . This is followed by an
editor's note that states:



*The list of web accessibility evaluation [tools] is currently out of date.
It is expected to be updated before the next publication of this document.*



This list is more than just a little out of date: it was last updated in
March 2006. Of the items currently listed, fewer than two dozen of the
non-specialty tools (that is, setting aside color contrast checkers and the
like) are actually still in existence. The current list includes tools sold
by vendors that have since been acquired twice over. Though the editor's
note exists, it bears mentioning that if the referenced list of tools is
not brought up to date, the link should be removed from this document.
Continued mention of the out-of-date tools list does more to confuse than
to inform.
Specific Comments

2.1.1 Content Types

This section ostensibly discusses the content types that evaluation tools
can or should test. If that is in fact the case, then the following
comments apply:
CSS & JavaScript

The current state of the art for automated accessibility testing cannot,
will not, and ought not test CSS or JavaScript outside of the context of
how they affect the web document being tested. Testing either of these
content types is only relevant in that context, due to their relationship
with the browser DOM. Testing CSS or JavaScript files by themselves would
be exceedingly complex and of little actual value for accessibility
evaluation. For instance, actually testing the color contrast of an object
on a web page requires testing more than the style sheet declarations; it
necessitates testing the computed style of the object. Consider the
following structure:



<body>
  <main>
    <div id="some-stuff">
      <!-- content -->
    </div>
  </main>
</body>



And consider the following styles for that HTML:



body {
    background-color: #fff;
}

main {
    background-color: #333;
}

#some-stuff {
    color: #fff;
}





Given the above, calculating the color contrast for #some-stuff requires
measuring the text color of #some-stuff against the background-color of the
main element. The information necessary to check the color contrast of
#some-stuff is not available outside of the browser context. The same goes
for the testing of JavaScript. The ERT WG should clarify this for the end
user.
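

To make the point concrete, here is a minimal sketch, not a production
contrast checker. It assumes the markup and styles above, uses the
browser's getComputedStyle, and implements the WCAG 2.0 relative luminance
and contrast ratio formulas; the function names and the white-canvas
fallback are illustrative only, not any real tool's API.

// Minimal sketch: resolve the rendered foreground and background colors
// of #some-stuff and compute their WCAG 2.0 contrast ratio. Runs in a
// browser context.

// Walk up the tree past transparent backgrounds to find the color
// actually rendered behind the element (here, #333 from <main>).
function effectiveBackgroundColor(el) {
  while (el) {
    var bg = window.getComputedStyle(el).backgroundColor;
    if (bg && bg !== 'transparent' && bg !== 'rgba(0, 0, 0, 0)') {
      return bg;
    }
    el = el.parentElement;
  }
  return 'rgb(255, 255, 255)'; // assume a white canvas if nothing is set
}

// WCAG 2.0 relative luminance of an 'rgb(r, g, b)' string.
function relativeLuminance(rgb) {
  var channels = rgb.match(/\d+/g).map(function (c) {
    c = parseInt(c, 10) / 255;
    return c <= 0.03928 ? c / 12.92 : Math.pow((c + 0.055) / 1.055, 2.4);
  });
  return 0.2126 * channels[0] + 0.7152 * channels[1] + 0.0722 * channels[2];
}

// WCAG 2.0 contrast ratio: (L1 + 0.05) / (L2 + 0.05), lighter over darker.
function contrastRatio(fg, bg) {
  var l1 = relativeLuminance(fg);
  var l2 = relativeLuminance(bg);
  return (Math.max(l1, l2) + 0.05) / (Math.min(l1, l2) + 0.05);
}

var el = document.getElementById('some-stuff');
var fg = window.getComputedStyle(el).color;          // rgb(255, 255, 255)
var bg = effectiveBackgroundColor(el.parentElement); // rgb(51, 51, 51)
contrastRatio(fg, bg);                               // roughly 12.6:1

Note that the computed values fg and bg, not the raw style sheet rules, are
what a contrast algorithm must compare.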



Additionally, I feel it is especially important for the ERT WG to mention
that testing of the rendered browser DOM *is not* common among automated
testing tools, as I discuss here:
http://www.karlgroves.com/2013/09/06/web-accessibility-testing-tools-who-tests-the-dom/.
Testing the rendered browser DOM is critical for accurate accessibility
testing of web documents, and readers need to know its importance.
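

As a trivial, hypothetical illustration: a tool that reads only the served
source would report nothing for the page below, yet the rendered DOM
contains an image with no text alternative.

<!-- The served HTML contains no <img> element at all. -->
<div id="banner"></div>
<script>
  // Script inserts an image with no alt attribute. Only a tool that
  // inspects the rendered DOM after script execution will catch this.
  document.getElementById('banner').innerHTML = '<img src="promo.png">';
</script>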
Media Resources

Accessibility testing of images is an emerging area for accessibility
testing tools. Many image manipulation libraries exist which can, with
unfortunately low levels of accuracy, OCR images. Theoretically, the OCR'd
text can be checked for sufficient contrast against its background.
Semi-automated capabilities exist as well, as do other theoretical
possibilities. However, accessibility testing of media files is, as far as
I can tell, not happening and is unlikely to happen any time soon. Testing
for proper alternatives, existence and accuracy of captions and
transcripts, color contrast, background noise, and so on is theoretical at
best, and certainly no commercially viable tools currently do so.
2.2.2 Test Modes...

This section includes the claim "Since it is a known fact that automatic
tests only cover a small set of accessibility issues...". While I agree
with this statement, it would be good to cite a reference in this section.
It would also be prudent to disclose the likely disparity between what can
theoretically be tested and what is commonly tested.
2.2.4 Test automation

The inclusion of WebDriver in this section is excellent; however, many
other options exist. WebDriver truly excels at performing "tests that
automate the application's and the end-users' behaviour." In doing so,
WebDriver can help automate acceptance testing that relies on user
interaction to trigger specific UI states which a web crawler could not
trigger on its own. But it is neither the only tool for introspection nor
the only tool for test automation. Many unit test frameworks exist that
could be utilized for accessibility testing. While I acknowledge the W3C's
desire to be vendor neutral, it does seem prudent that the existence of
such frameworks be acknowledged. It would also be prudent to mention that
accessibility testing can be performed as part of, or in conjunction with,
nearly any unit test or acceptance testing framework, as the sketch below
suggests.
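

For instance, here is a minimal sketch of an accessibility check hosted
inside a Mocha-style unit test. The auditPage function is hypothetical: a
placeholder for whatever API an actual accessibility testing tool might
expose to a test runner.

var assert = require('assert');

describe('home page', function () {
  it('contains no images without text alternatives', function () {
    // auditPage is hypothetical: it stands in for a real tool's API
    // that loads a URL, runs the selected checks against the rendered
    // DOM, and returns the violations it found.
    var results = auditPage('http://example.com/', { rules: ['img-alt'] });
    assert.strictEqual(results.violations.length, 0);
  });
});

The point is not this particular runner: the same check could run inside
nearly any unit or acceptance testing framework as part of a normal build.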
2.3.1 Standard reporting languages

The statement “Support for standard reporting languages like EARL [EARL10]
is a requirement for many users.” is completely untrue.  Google search
results for "Evaluation and Report Language" number only 3,410. This number
speaks for itself in demonstrating the irrelevance of EARL as a “standard
reporting format”.
2.3.7 Error repair

Based on its content, this section would be more accurately titled "Error
repair guidance". As noted in the second paragraph of the section, the
automated repair of issues found is a path down which one should not
travel. In fact, all notable tools that did so (and still exist) have
stripped this capability entirely.
3 Example profiles of evaluation tools

Again, the repetitious mention of EARL is of no use.



-- 

Karl Groves
www.karlgroves.com
@karlgroves
http://www.linkedin.com/in/karlgroves
Phone: +1 410.541.6829

www.tenon.io

What is this thing and what does it do?
http://vimeo.com/84970341

http://lanyrd.com/profile/karlgroves/
