Re: Step 1.b Goal of the Evaluation - Design support evaluation vs. conformance evaluation?

Hi,

just some quick input in case you do cover my proposal to modify "Goal of the Evaluation" today.

I get that #3, the In-Depth Analysis Report, is close to what I would call a "design support test" (or "development support test"), since you usually conduct it when you *know* that the site will not conform - the aim is to identify all the issues that need to be addressed before a conformance evaluation has a chance to be successful.

Since it usually comes first, I find it odd that it is mentioned last, and that no hint is given that this is usually an evaluation whose aim is *not* conformance (because you already know there will be a number of issues that fail SCs).

The one thing lacking in goal #3 is the requirement to cover all SCs across the sample of pages (with or without detail), thereby providing a benchmark for the degree of conformance already reached - even if it is necessarily a crude one.

So there are two things missing from the three types of goals we have now:

(1) a clear indication (in the name of the report type) that there is one evaluation that does *not* aim to measure conformance but happens in preparation for a final test, with the aim of unearthing problems;
(2) the ability, in this type of test, to provide a metric of success across all SCs for the pages in the sample that can be compared to a later conformance evaluation of the same site.

Sorry, I would have loved to participate today but my voice isn't up to it...

Best,
Detlev
On 5 Jun 2013, at 16:34, Velleman, Eric wrote:

> Hi Detlev,
> 
> I tend to look at the more detailed explanation of the three types of Reports in Step 5.a [1]: 
> 
> 1. Basic Report
> 2. Detailed Report
> 3. In-Depth Analysis Report
> 
> For me the difference between #2 and #3 is in the level of detail that is required in the Report. #2 is more on the page level, and #3 is more on the website level:
> 
> #3 is a way of reporting that does not require you to name every failure on every page. The evaluator is asked to give a certain number of examples of the occurrence of the failures on the website (not on every page, as in the Detailed Report). This makes #2 better for statistics and research.
> 
> Does this make sense?
> 
> Eric
> 
> 
> [1] http://www.w3.org/TR/WCAG-EM/#step5
> ________________________________________
> From: Detlev Fischer [detlev.fischer@testkreis.de]
> Sent: Thursday, 30 May 2013 17:15
> To: public-wai-evaltf@w3c.org
> Subject: Step 1.b Goal of the Evaluation - Design support evaluation vs. conformance evaluation?
> 
> Hi everyone,
> as promised in the telco, here is a thought on the current section "Goal of the Evaluation".
> 
> Currently we have:
> 1. Basic Report
> 2. Detailed Report
> 3. In-Depth Analysis Report
> 
> For me, 2 and 3 have always looked a bit similar, as there is no clear line between specifying issues on pages and giving advice on improvements (often, you cannot easily specify remedies in detail because, as testers, we are often not familiar with the details of the development environment).
> 
> In the discussion it struck me that we seemed to have a (largely?) shared notion that our evaluation work usually falls into one of two categories:
> 
> 1. Design support evaluation: Take an (often unfinished) new design and find as many issues as you can to help designers address & correct them (often in preparation for a future conformance evaluation/conformance claim).
> 2. Conformance evaluation: Check the finished site to see if it actually meets the success criteria (this may take the form of laying the groundwork for a conformance claim, or challenging a conformance claim if a site is evaluated independently, say, by some organisation wanting to put an offender on the spot).
> 
> Most of our work falls into one of these two categories, and you won't be surprised that we sell design support tests (one tester) as preparation for final tests (in our case, two independent testers). (And I should mention that our testing scheme currently does not follow the 100% pass-or-fail conformance approach.)
> 
> There is actually a third use case, which is checking old sites known to have issues *before* an organisation starts a re-design - so they see the scope of problems the re-design will need to address (and are also aware that there may be areas which they *cannot* easily address, and can determine how to deal with those areas).
> 
> Sorry again to raise this point somewhat belatedly. Hope this will trigger a useful discussion.
> Best,
> Detlev
> 
> 
> --
> Detlev Fischer
> testkreis c/o feld.wald.wiese
> Thedestr. 2, 22767 Hamburg
> 
> Mobil +49 (0)1577 170 73 84
> Tel +49 (0)40 439 10 68-3
> Fax +49 (0)40 439 10 68-5
> 
> http://www.testkreis.de
> Consulting, testing and training for accessible websites
> 

-- 
Detlev Fischer
testkreis - das Accessibility-Team von feld.wald.wiese
c/o feld.wald.wiese
Thedestraße 2
22767 Hamburg

Tel   +49 (0)40 439 10 68-3
Mobil +49 (0)1577 170 73 84
Fax   +49 (0)40 439 10 68-5

http://www.testkreis.de
Consulting, testing and training for accessible websites
