- From: Userite <richard@userite.com>
- Date: Fri, 23 Mar 2018 12:10:36 -0000
- To: "Chris Leighton" <chris.leighton@uwa.edu.au>, <w3c-wai-ig@w3.org>
- Message-ID: <521F4BD57DDE4895B7010F8E48AAA42A@RichardPC>
Hi Chris,

We have recently undertaken such an exercise for a local university. We have done over 40 applications in the last twelve months. From that experience my advice is as follows:

1. Go through the application manually to identify errors in relation to the W3C guidelines. If you are not fully conversant with these, then find someone who is. To help, you will need to install an accessibility toolbar that lists the headings, alt text, frames, etc. This makes testing easier.

2. Don't use fully automated testing applications, because they deliver too many false positives or negatives and cannot tell whether things like alt text are really meaningful.

3. During your testing, make sure that you also test using just the keyboard.

4. You do not need to identify every single occurrence of a particular error. If an error is fairly common (say, missing alt text), then simply explain why it is an error and mention some examples. The developer can then take responsibility for making sure that all the occurrences are dealt with (s/he will know the application better than you, so this is easier for him/her than for you).

5. Deliver a test report that simply scores each of the guidelines and explains briefly where some of the fails are and why they fail (see the sketch below).

6. Work with the IT staff and developer to facilitate speedy adjustments (or to find an alternative application if necessary).

7. When, and only when, you think the application is in fairly good order, you should arrange for some user testing. I have a small band of testers who know how to report problems, but it is not fair to ask them to do a test if you already know that they will have big problems!

8. Finally, if it is an application that is a major part of the University's offering, work with the University to identify a range of disabled people to also test the application by performing the required tasks.

Regards
Richard Warren
Technical Manager, Userite
www.userite.com

From: Chris Leighton
Sent: Friday, March 23, 2018 7:57 AM
To: w3c-wai-ig@w3.org
Subject: How to evaluate and report on many websites cohesively?

Dear all,

I'm musing over the best method to evaluate the many websites under our University's umbrella, and to report on them cohesively. Automation, manual audit and user testing by sampling will be included. Identifying actionable items and representative scoring over time are also among the objectives.

I have these W3C resources open: the WCAG-EM Report Tool, the WCAG-EM Overview, the Template for Accessibility Evaluation Reports, plus a few others. While not a problem in itself, the reporting concepts they offer perhaps don't address reporting on multiple sites in a cohesive manner. Is there any experience in the collective that I may lean on? I am interested in the basics and the highfalutin.

Thanks in advance.

Regards,
Chris
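For the cohesive multi-site reporting Chris asks about, one simple approach that fits Richard's per-guideline scoring advice is to record each site's outcome per WCAG success criterion and roll the results up into a single summary. A minimal sketch in Python, assuming hypothetical site names and a plain pass/fail outcome rather than the WCAG-EM Report Tool's full report format:

    # Minimal sketch: per-site, per-success-criterion results rolled up
    # into one cross-site summary. Site names and outcomes below are
    # hypothetical examples, not real audit data.
    from collections import defaultdict

    # outcome per (site, WCAG success criterion): "pass", "fail", or "n/a"
    results = {
        ("library.example.edu",   "1.1.1 Non-text Content"): "fail",
        ("library.example.edu",   "2.1.1 Keyboard"):         "pass",
        ("enrolment.example.edu", "1.1.1 Non-text Content"): "pass",
        ("enrolment.example.edu", "2.1.1 Keyboard"):         "fail",
    }

    def summarise(results):
        """Count outcomes per success criterion across all sites."""
        summary = defaultdict(lambda: {"pass": 0, "fail": 0, "n/a": 0})
        for (site, criterion), outcome in results.items():
            summary[criterion][outcome] += 1
        return summary

    for criterion, counts in sorted(summarise(results).items()):
        print(f"{criterion}: {counts['pass']} pass, {counts['fail']} fail")

Re-running the same tally after each audit round gives a representative score over time; whether a simple structure like this or the WCAG-EM report template is the right fit depends on how much detail each site's developers need.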
Received on Friday, 23 March 2018 12:11:11 UTC