- From: Henrike Gappa <henrike.gappa@fit.fraunhofer.de>
- Date: Sat, 02 Jan 2010 18:56:38 +0100
- To: wai-eo-editors@w3.org
Dear editors,

Please find below some comments regarding the posted notes:

- Involving Users in Web Projects for Better, Easier Accessibility (http://www.w3.org/WAI/users/involving)
- Involving Users in Evaluating Web Accessibility (http://www.w3.org/WAI/eval/users)

To "Involving Users in Web Projects for Better, Easier Accessibility"

We wholeheartedly support encouraging web site developers and owners to conduct user testing for better accessibility. However, we feel that the limitations of the proposed test methodology need to be addressed much more clearly. For instance, it should be stressed that test results obtained with only a small group of disabled users can only be understood as informative, also with regard to generalizing the results to users with the same characteristics, e.g., blind or screen-reader users; there is a lot of diversity among people with the same disability, even when they use the same assistive technology. Furthermore, to our knowledge, deep insight into accessibility guidelines and, at least to some extent, experience with assistive technologies are preconditions for really employing user testing to improve accessibility and to create accessible web applications. In other cases, its benefit is more of a motivational and educational nature, which is of great value and thus highly recommendable. The described issues are mentioned in passing in the document, but should be pointed out much more explicitly. When contrasted with the section "More Efficient Development", for instance, the limitations of the proposed methodology will most likely not be identified by the uninformed reader.

Besides this, we are afraid it might not become clear to the reader which user groups are addressed by the note. Most of the time the note names only people with disabilities, sometimes people with disabilities and older people, and sometimes only "users", for instance in the section "More Efficient Development".

Suggesting "ask a lot of questions" as a test methodology for gathering user data is vague; we propose being more concrete. Since test sessions have a time limit, it is advisable for many test purposes to develop at least a questionnaire guide to ensure that all relevant issues are covered and that test results are comparable in case there are several test participants.

To "Involving Users in Evaluating Web Accessibility"

Our first comment on this note is -- as it was for "Involving Users in Web Projects for Better, Easier Accessibility" -- that the limitations of the described evaluation methodology need to be pointed out more clearly. We are in favour of user testing, and agree that it may reveal accessibility issues that would not have been detected via standard conformance testing. However, it is also important to note that many accessibility errors will definitely not be detected by user testing alone. Therefore, in contrast to what is stated in the introduction of this note, conformance checks against all relevant accessibility guidelines are, to our knowledge, not only "important" but need to be understood as the basis of all accessibility evaluations. We would also suggest conducting a full review instead of a preliminary one, and fixing all errors before bringing in users, in order to avoid operation errors of assistive technologies caused by accessibility errors in the code.

In the section "Range of User Evaluation", informal vs. formal usability evaluations are compared.
We think it would be important to also explain here the differences in outcome in terms of significance and validity. The gains and limitations of involving only a few people with disabilities, as stated in the section "Basics", should also be pointed out clearly, so that the reader is able to conclude correctly what an appropriate test scenario for her purposes will be.

In the section "Drawing Conclusions and Reporting", the editors state the need to be careful when "... drawing conclusions from limited evaluations ...". However, we feel that this is not explicit enough, because valid conclusions cannot really be drawn from such user testing, and this should be clear to the reader. The problem is not so much the lack of statistical significance, as mentioned, but the limited validity and generalisability of the test results.

In the section on "Analyzing Accessibility Issues", it is proposed to assign occurring accessibility issues to their origin, e.g., "the developer did not markup/code the web page properly" or "the user's AT isn't handling the markup properly". From our experience with accessibility audits, this can only be achieved by accessibility experts: for a reliable judgement, the evaluator needs deep knowledge of HTML, accessible web coding and the standard behaviour of AT. Otherwise, such mappings are in danger of being error-prone.

Finally, the section "More Information and Guidance" provides information on user testing specifically for usability professionals. Here, we do not really understand what is meant by "usability testing for accessibility". Both user testing for usability and user testing for accessibility are based on methods and techniques derived from psychology and related sciences, e.g., work psychology or software ergonomics. Thus, to our understanding, there are usability tests and accessibility tests, which pursue different goals. They might overlap with regard to certain sub-goals and the methodologies employed, yet the goals are quite different and, to our understanding, the two disciplines should not be intermixed.

Kind regards,
Henrike Gappa, Gabriele Nordbrock and Carlos Velasco

--
Henrike Gappa
Fraunhofer-Institut für Angewandte Informationstechnik FIT
[Fraunhofer Institute for Applied Information Technology FIT]
Web Compliance Center - http://www.fit.fraunhofer.de/
imergo®: http://imergo.com/ http://imergo.de/
Schloss Birlinghoven, D 53757 Sankt Augustin (Germany)
Tel: +49 2241 14-2793
Fax: +49 2241 14-2065