Re: Comments WAET

Hi Wilco, hi all,

Here are my thoughts on the WAET - to be discussed in the meeting this
afternoon.

Kind regards
Annika


== Review of http://www.w3.org/TR/2014/WD-WAET-20140724/ ==

Abstract:
"Features to specify and manage (...) web accessibility evaluations".
The aspect of managing web accessibility evaluations is not covered by 
the features. "2.4.1 Workflow integration" focuses mainly on developers, 
but the person responsible for managing the process (e.g. of creating a 
new web site) is usually not the developer.

2.1.1 Content types
"From the accessibility standpoint, the evaluation of these resources is 
relevant for issues like colour contrast, colour blindness or media 
alternatives, for instance." Resources can't be colour blind. 
Suggestion: "colour differentiation" or "distinguishability"?

2.2 Testing functionality
Suggested feature: Support users in manual testing by emulating 
assistive technologies (such as screen readers).

2.3 Reporting and monitoring
For users who want to import, export or compare testing results, the 
major challenge is to align test results from different sources. This 
is related to "2.3.3 Import/export functionality" and "2.3.5 Results 
aggregation" but could also be added as a new feature.
Suggested new feature: The results use a common way of identifying the 
accessibility problems that are reported, for example WCAG 2.0 
Techniques or Success Criteria.
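
To make the alignment idea more concrete, here is a small Python 
sketch (purely illustrative on my part, not something from the draft; 
the tool names and result values are made up): if two hypothetical 
tools both key their findings by WCAG 2.0 Success Criterion number, 
their results can be compared directly even though the tools use 
different internal rules.

    # Hypothetical results from two tools, keyed by WCAG 2.0
    # Success Criterion number (illustrative values only).
    tool_a_results = {
        "1.1.1": "fail",   # Non-text Content
        "1.4.3": "pass",   # Contrast (Minimum)
    }
    tool_b_results = {
        "1.1.1": "fail",
        "1.4.3": "cannottell",
    }

    def compare(a, b):
        """Return the Success Criteria on which the two tools disagree."""
        return {sc: (a.get(sc), b.get(sc))
                for sc in set(a) | set(b)
                if a.get(sc) != b.get(sc)}

    print(compare(tool_a_results, tool_b_results))
    # {'1.4.3': ('pass', 'cannottell')}

Without a shared identifier of this kind, aligning results from 
different sources is guesswork.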




On 06.08.2014 at 16:14, Wilco Fiers wrote:
> Dear ERT,
>
> I just wanted to say I think you all did a great job on the WAET. I've written up a few thoughts I had while reviewing the public draft. I've asked the members of the auto-wcag community group to see if they can review the document as well. Hope this is of some help for you all! Looking forward to seeing how the document will develop further.
>
> Regards,
>
> Wilco Fiers
> Accessibility Foundation
>
>
> - COMMENT 1 -
> 2.1.1 Content types: This confused me a bit, because of the word 'content'. In WCAG the word 'content' means something different than it does in HTTP. I think what is called 'content' here is actually what WCAG calls 'technologies'. Maybe something like "Processed technologies" is clearer, as the main question here seems to be: does the tool look at just the HTML, or does it take CSS, JavaScript, etc. into account?
>
> - COMMENT 2 -
> A feature I miss that relates to automated tools is reliability benchmarking. There are big differences between the reliability of different automated tools. Knowing how many tests a tool has and how reliable their findings are can be important. When you use a tool that monitors large numbers of web pages, it is more important that the tool provides reliable results. But when you are developing a website, it is important that a tool gives you as many potential issues as it can find and lets the developer figure out which are real issues and which are false positives.
>
> - COMMENT 3 -
> 2.4.1 Workflow integration mentions bug tracking. I would like this to be a little more extensive. For instance, are there protocols that bug/issue trackers use that are recommended? How should you ensure that the same issue doesn't get logged multiple times, either because it comes from a second evaluation or because it is an issue in a template and so repeats on many pages?
>

Received on Thursday, 7 August 2014 09:41:42 UTC