WCAG-EM feedback

Hi All


Digital Access at Vision Australia have reviewed WCAG-EM and applied the methodology to a recent audit we undertook on a government website with dynamically generated content. The approach currently used by Digital Access is in line with the procedures outlined in the Unified Web Evaluation Methodology (UWEM) for generating an evaluation sample that includes representative key pages and functionality of the website, and we supplement UWEM with approaches similar to those outlined in WCAG-EM. In summary, we support the overall approach, methodology and recommendations for undertaking an accessibility audit as outlined in WCAG-EM, and believe it can add value to our current approach.



We raised a few questions and comments as a group, identified below for your consideration; we would be happy to discuss them further:



Step 2.d: Identify Web Technologies Relied Upon

We agree that it may be useful for particular development teams and website owners to identify the libraries and components used to create the website, such as Dojo or jQuery. This is appropriate as a Note rather than as part of the main requirement.



Step 3: Select a Representative Sample and Step 5.a: Provide Documentation for Each Step

During an evaluation we use an in-house spider tool to assist with Step 2: Explore the Target Website, which in turn informs Step 3: Select a Representative Sample. The tool is not used to identify whether something passes or fails, but to drill down into the taxonomy to identify the different content types, dynamic content, components and technologies used. We often move on to evaluating pages not in the original sample by following a process, link or button on a page.
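For illustration only, the minimal Python sketch below shows the kind of crawl-and-classify pass described above. It is not our in-house tool; the starting URL and page limit are hypothetical. It tallies script filenames (for example jquery.js or dojo.js) and the presence of forms, rather than recording pass/fail results.

# Minimal sketch (not our in-house tool) of a crawl that catalogues
# technologies and content markers rather than pass/fail results.
from collections import Counter
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse
from urllib.request import urlopen

START_URL = "https://www.example.gov.au/"  # hypothetical target site
MAX_PAGES = 50

class PageProfiler(HTMLParser):
    """Collects links plus rough markers of a page's technologies."""
    def __init__(self):
        super().__init__()
        self.links = []
        self.scripts = []
        self.has_form = False

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "a" and a.get("href"):
            self.links.append(a["href"])
        elif tag == "script" and a.get("src"):
            self.scripts.append(a["src"])  # e.g. jquery.js, dojo.js
        elif tag == "form":
            self.has_form = True

def crawl(start):
    seen, queue, tech = set(), [start], Counter()
    host = urlparse(start).netloc
    while queue and len(seen) < MAX_PAGES:
        url = queue.pop(0)
        if url in seen or urlparse(url).netloc != host:
            continue  # stay on the target site, skip revisits
        seen.add(url)
        try:
            html = urlopen(url).read().decode("utf-8", "replace")
        except OSError:
            continue
        profiler = PageProfiler()
        profiler.feed(html)
        for src in profiler.scripts:
            tech[src.rsplit("/", 1)[-1]] += 1  # tally script filenames
        if profiler.has_form:
            tech["<form>"] += 1
        queue.extend(urljoin(url, href) for href in profiler.links)
    return seen, tech

# Usage: pages, technologies = crawl(START_URL)

The resulting tally of technologies and page types is what informs the structured sample, not any conformance judgement.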



When undertaking the audit against WCAG-EM, following the strict sampling steps and reporting a comprehensive list of the pages included in the sample set added time to the evaluation process. This time would have to be added to the full cost of the evaluation, which the client would need to be billed for. We would question the cost/benefit value of providing a full list of pages evaluated, rather than providing the corresponding URL(s) for each example of an issue together with the scope from which the sample set was derived (the domains and subdomains included or excluded).
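To illustrate the leaner alternative we have in mind, a minimal sketch of such a report structure follows; the field names and URLs are illustrative only and are not taken from WCAG-EM. Each issue carries its example URL(s), and the scope is stated once rather than enumerating every page in the sample set.

# Illustrative report structure: scope stated once, example URL(s)
# attached to each issue (field names are hypothetical).
report = {
    "scope": {
        "included": ["www.example.gov.au"],
        "excluded": ["intranet.example.gov.au"],
    },
    "issues": [
        {
            "criterion": "1.1.1 Non-text Content",
            "description": "Informative images missing text alternatives",
            "examples": [
                "https://www.example.gov.au/news",
                "https://www.example.gov.au/services/apply",
            ],
        },
    ],
}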



Step 3.e: Include a Randomly Selected Sample

5% or 5 pages: what benefit does this provide? In our experience, following the approach of the previous steps, such as Methodology Requirement 2.c, will ensure the pages are representative of the whole, potentially reducing the value of an additional random sample.
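For reference, our reading of the draft's rule is sketched below: the random sample is the larger of 5 pages or 5% of the structured sample, drawn from pages not already selected. This is our interpretation only; the exact rule may read differently in later drafts.

import math
import random

def draw_random_sample(all_pages, structured_sample):
    # Our reading of Step 3.e: the larger of 5 pages or 5% of the
    # structured sample, drawn from pages not already selected.
    chosen = set(structured_sample)
    n = max(5, math.ceil(0.05 * len(structured_sample)))
    pool = [p for p in all_pages if p not in chosen]
    return random.sample(pool, min(n, len(pool)))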



Step 4.a: Check for the Broadest Variety of Use Cases

"It is strongly recommended to also involve real users during this process..." Whilst we completely agree with this, and conduct 'technical' testing with end users during WCAG evaluations, it is no substitute for formal usability testing. 'Practical' accessibility issues are just as important as 'technical' accessibility issues, but they may be disregarded when assessed in the context of a technical evaluation, as they do not always map directly to WCAG 2.0.



We all interact with websites using different browsers, different ATs and different approaches. Determining the true nature of an accessibility issue when involving users is not as straightforward as merely asking them what they believe the issue is, or reporting on general observations as with mainstream users. To undertake comprehensive formal usability testing for accessibility, we believe there is a real requirement to understand user experience, technical accessibility, and how ATs are used and what they are capable of; it is rare to find people with all of these skills and experience.



Step 4.b: Assess Accessibility Support for Features

As there is no definitive list establishing which technologies or which particular AT are classified as 'accessibility supported', this is not always easy to determine without subjectivity between website owners coming into play.



"The Working Group, therefore, limited itself to defining what constituted support and defers the judgment of how much, how many, or which AT must support a technology to the community and to entities closer to each situation that set requirements for an organization, purchase, community, etc."  http://www.w3.org/TR/UNDERSTANDING-WCAG20/conformance.html#uc-support-level-head



Step 4.c: Use Techniques and Failures Where Possible (Optional)

In an Australian Government context this is not optional.



Step 5.c: Provide a Performance Score (Optional)

The document highlights some of the pros and cons of the three suggested approaches, and we can see the value for an organisation that wishes to benchmark its accessibility internally over time.



Our concerns about the wider use of performance scores presented to the public are detailed below (in no particular order). We raise this because we have been asked to provide a score, percentage or similar ranking measure on numerous occasions; however, we do not believe the intention behind these requests is, on the whole, beneficial to the needs of a person with a disability or age-related impairment:



*         All three approaches fail to include practical accessibility issues that are not captured, or intended to be captured, by a WCAG assessment. The final score may therefore not accurately reflect the lived experience of a person with a disability using the website

*         Website owners may withdraw the resources needed to rectify outstanding WCAG issues once they achieve a score they deem adequate

*         Scores may likewise reduce website owners' incentive to undertake formal usability testing once they achieve a score they deem adequate

*         From what we can establish, the three approaches will produce different final scores when applied to the findings of the same audit, so there will be some discrepancy about how accessible a website is (a worked example follows this list)

*         Website owners may ask consultants to use the most favourable of the three approaches if the score is to be made public

*         None of the three approaches appears to reflect the impact of an issue on an end user (a Level A issue versus a Level AAA issue), so the scores are somewhat arbitrary

*         Making the performance scoring meaningful involves quite a lot of additional effort from the evaluator; only the 'per website' approach would realistically be achievable for a commercial engagement, and we question whether the added value is really there.
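To make the variance concern concrete, here is a minimal sketch using made-up audit data and our own simplified reading of the 'per website' and 'per web page' approaches, in which the same findings yield very different scores:

# Illustrative only: two scoring approaches over the same made-up data.
# 'results' maps page -> success criterion -> True (pass) / False (fail).
results = {
    "/home":    {"1.1.1": True,  "2.4.4": True,  "1.4.3": False},
    "/search":  {"1.1.1": False, "2.4.4": True,  "1.4.3": True},
    "/contact": {"1.1.1": True,  "2.4.4": False, "1.4.3": True},
}

# Per website: a criterion passes only if it passes on every page.
criteria = {c for page in results.values() for c in page}
per_website = sum(
    all(results[p].get(c, True) for p in results) for c in criteria
) / len(criteria)

# Per web page: average each page's pass ratio.
per_page = sum(
    sum(page.values()) / len(page) for page in results.values()
) / len(results)

print(f"per-website score: {per_website:.2f}")  # 0.00 on this data
print(f"per-page score:    {per_page:.2f}")     # 0.67 on this data

With every criterion failing on at least one page, the per-website score is 0.00 while the per-page score is 0.67 for the very same audit, which is the discrepancy we are concerned about.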

We hope you find our feedback of value, and that it is not too late.

All the best
Neil

---
Neil King
National Manager Digital Access
Vision Australia
4 Mitchell St, Enfield, NSW 2136
P: 02 9334 3547
M: 0426 241 870
E: neil.king@visionaustralia.org
W: http://www.visionaustralia.org/digitalaccess



Received on Wednesday, 3 April 2013 05:46:56 UTC