- From: Phill Jenkins <pjenkins@us.ibm.com>
- Date: Wed, 27 May 2015 09:44:31 -0500
- To: wai-eo-editors@w3.org
- Cc: Sharron Rush <srush@knowbility.org>
- Message-ID: <OF71CF323A.46B1A691-ON86257E52.004EB9A2-86257E52.0050FBD2@us.ibm.com>
I see two major pieces of information missing from the methodology and
reporting tool:
1. Scope (what is also called the "requirements step" by other
professionals) is missing the tools and audit techniques that will be
used. It is critical that this is agreed to up front and documented in
the "plan", since not all tools and techniques result in the same
findings! Tools and techniques are divided into 3 types:
1. Semi-automated Checking tools such as Deque WorldSpace, SSB
Bart AMP, IBM Mobile Accessibility Checker, Wave, HTML code validations,
and others
2. Compatibility Testing using assistive technology such as JAWS,
NVDA, VoiceOver, TalkBack, etc.
3. Subject Matter Expert (SME) Inspection using software tools and
simulators such as Firefox Accessibility Extension, IE Toolbar, source
code inspections, etc.
Note that these three types are not the same as doing an end-user
assessment using participants with disabilities. User research and design
evaluations are scoped very differently from accessibility verification
testing (AVT), accessibility assessments, and audits. User Experience (UX)
research is done with designs (e.g. wireframes) and/or prototypes, while
AVT is done with pre- or post-production code.
2. Test Data: the user IDs, passwords, and data needed to explore the web
app or mobile app. I'm assuming that the "URLs section" includes the test
server and test environments needed to conduct the evaluations, but a note
should be added to make that explicit. Test data typically includes data
such as account numbers, zip codes, values that create certain screens,
and data that causes error conditions. All of these need to be gathered
and documented, typically in the Scope step, and they must be in place
before the Explore step can be completed.
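As a rough sketch of the kind of test-data record described above (the field names and values are illustrative only, not part of WCAG-EM or the Report Tool), a documented test-data set gathered during the Scope step might look like this:

```python
# Hypothetical test-data manifest for an accessibility evaluation plan.
# All names and values are illustrative placeholders, not from WCAG-EM.
test_data = {
    "test_environment": "https://test.example.org/app",  # placeholder URL
    "credentials": [
        {"role": "standard user", "user_id": "eval-user-01"},
        {"role": "administrator", "user_id": "eval-admin-01"},
    ],
    "seed_data": {
        "account_numbers": ["0000-TEST-0001"],
        "zip_codes": ["78701"],
    },
    # Inputs chosen to trigger specific screens or error conditions.
    "error_triggers": [
        {"field": "zip_code", "value": "00000",
         "expected": "validation error screen"},
    ],
}

def ready_for_explore(data):
    """The Explore step should only begin once all test data is documented."""
    required = ("test_environment", "credentials", "seed_data",
                "error_triggers")
    return all(data.get(key) for key in required)

print(ready_for_explore(test_data))
```

Checking for completeness this way makes the "documented before Explore" requirement mechanical rather than a matter of reviewer judgment.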
Without this level of robustness, the evaluation methodology is not
repeatable and the reporting tool's validity and value is suspect.
____________________________________________
Regards,
Phill Jenkins,
IBM Accessibility
----- Forwarded by Phill Jenkins/Austin/IBM on 05/27/2015 09:19 AM -----
From: Sharron Rush <srush@knowbility.org>
To: w3c WAI List <w3c-wai-ig@w3.org>
Date: 05/27/2015 09:02 AM
Subject: CORRECTION: WCAG-EM Report Tool feedback welcome
Hello again,
I hope some of you have been looking at the Report Tool. I made an error
in the suggested feedback pathway, and here is the corrected information.
Rather than clutter the list, please send your comments through the
channel noted in the footer of the Tool itself:
"Feedback: We welcome ideas, bug reports, and comments via GitHub or
e-mail to wai-eo-editors@w3.org (a publicly archived list) or wai@w3.org
(a WAI staff-only list)."
Thanks, we hope to hear from you!
Best,
Sharron
On Tue, May 26, 2015 at 12:02 PM, Sharron Rush <srush@knowbility.org>
wrote:
Greetings all,
In March, WAI released the first version of the WCAG-EM Report Tool. [1]
We got some useful feedback at AccessU earlier this month and are
interested in more information from our community about how the tool is
being used.
Please send email here or to me off-list if you prefer. We are seeking
input about how the tool is useful in your assessment work, what barriers
you may have found to understanding or using it, and what would improve
the tool for your situation.
The plan is to gather feedback and use it for the next iteration of the
Report Tool. Thanks for any information you can provide.
[1] http://www.w3.org/WAI/eval/report-tool/#/
Best,
Sharron
---
Sharron Rush
Co-chair, EOWG
--
Sharron Rush | Executive Director | Knowbility.org | @knowbility
Equal access to technology for people with disabilities
Received on Wednesday, 27 May 2015 14:45:05 UTC