- From: Birkir Gunnarsson <birkir.gunnarsson@deque.com>
- Date: Thu, 7 Aug 2014 08:46:26 -0400
- To: "'Annika Nietzio'" <an@ftb-volmarstein.de>, <public-auto-wcag@w3.org>
Regrets for today's call guys. This week has been crazy for me due to our project schedule and my colleagues' summer vacation plans. For some reason early August is when everyone is taking their time off. I have been the only guy working on a project where we usually have 4 people, and there are a lot of high-level meetings and such that I need to attend. Next week is already better; after that I go on vacation (except I will attend our meeting that week and work on our stuff during my vacation from my day job, so I will be productive). After that it is back to normal. Behave kiddos, don't do anything I wouldn't do, whatever that is!

-----Original Message-----
From: Annika Nietzio [mailto:an@ftb-volmarstein.de]
Sent: Thursday, August 7, 2014 5:41 AM
To: public-auto-wcag@w3.org
Subject: Re: Comments WAET

Hi Wilco, hi all,

here are my thoughts on the WAET - to be discussed in the meeting this afternoon.

Kind regards
Annika

== Review of http://www.w3.org/TR/2014/WD-WAET-20140724/ ==

Abstract: "Features to specify and manage (...) web accessibility evaluations". The aspect of managing web accessibility evaluation is not taken up in the features. "2.4.1 Workflow integration" focuses mainly on developers. But the person responsible for managing the process (e.g. of creating a new web site) is usually not the developer.

2.1.1 Content types
"From the accessibility standpoint, the evaluation of these resources is relevant for issues like colour contrast, colour blindness or media alternatives, for instance." Resources can't be colour blind. Suggestion: "colour differentiation" or "distinguishability"?

2.2 Testing functionality
Suggested feature: Support users in manual testing by emulating assistive technologies (such as screen readers).

2.3 Reporting and monitoring
For users wanting to import/export/compare testing results, the major challenge is to align the test results from different sources.
This is related to "2.3.3 Import/export functionality" and "2.3.5 Results aggregation" but could also be added as a new feature. Suggested new feature: The results use a common way of identifying the accessibility problems that are reported. This could be WCAG 2.0 Techniques or Success Criteria.

On 06.08.2014 at 16:14, Wilco Fiers wrote:
> Dear ERT,
>
> I just wanted to say I think you all did a great job on the WAET. I've written up a few thoughts I had while reviewing the public draft. I've asked the members of the auto-wcag community group to see if they can review the document as well. Hope this is of some help for you all! Looking forward to seeing how the document will develop further.
>
> Regards,
>
> Wilco Fiers
> Accessibility Foundation
>
>
> - COMMENT 1 -
> 2.1.1. Content types: This confused me a bit, because of the word 'content'. In WCAG the word 'content' means something different than it does in HTTP. I think for WCAG what is called 'content' here is actually 'technologies'. Maybe something like "Processed technologies" is clearer, as the main question here seems to be: does the tool look at just the HTML, or does it take CSS, JavaScript, etc. into account?
>
> - COMMENT 2 -
> A feature I miss that relates to automated tools is reliability benchmarking. There are big differences between the reliability of different automated tools. Knowing how many tests a tool has and how reliable their findings are can be important. When you use a tool that monitors large numbers of web pages, it is more important that a tool provides reliable results. But when you are developing a website, it is important that a tool gives you as many potential issues as it can find and lets the developer figure out which are real issues and which are false positives.
>
> - COMMENT 3 -
> 2.4.1 Workflow integration mentions bug tracking. I would like this to be a little more extensive. For instance, are there protocols that bug/issue trackers use that are recommended?
> How should you ensure that the same issue doesn't get logged multiple times, either because it comes from a second evaluation or because it's an issue that is in a template and so it repeats on many pages?
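Taken together, Annika's suggested feature (a common way of identifying reported problems, e.g. WCAG 2.0 Success Criteria) and Wilco's deduplication question in comment 3 point at the same mechanism. A minimal sketch of the idea, assuming hypothetical result fields (`sc`, `page`, `selector` are illustrative names, not part of the WAET):

```python
# Illustrative sketch only: deduplicating findings from multiple
# evaluation runs, assuming each result identifies its problem by a
# common identifier (here a WCAG 2.0 Success Criterion) plus a location.

def fingerprint(result):
    """Identify an issue by what it is and where it occurs,
    not by which evaluation run reported it."""
    return (result["sc"], result["page"], result["selector"])

def deduplicate(results):
    """Keep only the first occurrence of each fingerprint, so a
    re-evaluation or a templated issue is not logged twice."""
    seen = set()
    unique = []
    for r in results:
        fp = fingerprint(r)
        if fp not in seen:
            seen.add(fp)
            unique.append(r)
    return unique

# Two runs report the same missing-alt-text issue on the same element:
run1 = [{"sc": "1.1.1", "page": "/home", "selector": "img#logo"}]
run2 = [{"sc": "1.1.1", "page": "/home", "selector": "img#logo"},
        {"sc": "1.4.3", "page": "/home", "selector": "p.intro"}]

merged = deduplicate(run1 + run2)  # the repeated finding appears once
```

Without a shared identifier scheme across tools, no such fingerprint can be computed, which is why aligning results from different sources is the hard part.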
Received on Thursday, 7 August 2014 12:46:54 UTC