Re: R: some initial questions from the previous thread

Just picking up Roberto Scano's question about the scope of the 
evaluation method to be developed, and Shadi's answer.

I guess it is always difficult to make a process or methodology scale 
well - that's why I somewhat awkwardly tried to raise the topic of the 
'range' of the proposed methodology in the first EVAL TF teleconference.

I agree with Shadi that the methodology should be useful both for 
self-assessment and third-party conformity assessment.

As an example of different types or layers of evaluation based on a 
common methodology, let me describe our own approach, BITV-Test (BITV 
is essentially equivalent to WCAG, with a few minor differences), at 
the risk of appearing just to toot my own horn here.

All three levels are based on the same 50 (German-language) checkpoints:
http://testen.bitvtest2.de/index.php?a=dl&t=s

Level 1 - the free self-assessment tool just requires prior 
registration. It uses the same metrics as the full-blown test but does 
not differentiate by page. As you will have guessed, this tool does not 
lead to any kind of conformance statement / accessibility seal. You can 
use it as a heuristic to check WCAG / BITV requirements across any 
number of pages. You can enter comments per checkpoint (like a list of 
issues / a to-do list for designers) and generate a PDF report of the 
aggregated results.
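
Just to make the aggregation concrete, here is a rough Python sketch of 
how per-checkpoint ratings could roll up into an overall score. The 
checkpoint names, point weights and 0.0-1.0 fulfilment scale are my own 
placeholder assumptions, not the actual BITV-Test values:

# Hypothetical aggregation of per-checkpoint ratings into a score.
# Weights and the rating scale are assumptions, not the real BITV-Test
# definitions (those are in the checkpoint list linked above).

# Each checkpoint carries a point weight; in the full test all weights
# would sum to 100.
checkpoint_weights = {
    "1.1.1 Alternative texts for images": 3.0,
    "2.4.4 Meaningful link texts": 2.0,
    # ... remaining checkpoints ...
}

# Ratings as a fulfilment factor between 0.0 and 1.0
# (e.g. not fulfilled = 0.0, partly fulfilled = 0.5, fulfilled = 1.0).
ratings = {
    "1.1.1 Alternative texts for images": 1.0,
    "2.4.4 Meaningful link texts": 0.5,
}

def total_score(weights, ratings):
    """Sum of weight * fulfilment over all checkpoints; unrated
    checkpoints count as 0."""
    return sum(weights[cp] * ratings.get(cp, 0.0) for cp in weights)

print(total_score(checkpoint_weights, ratings))  # 3.0*1.0 + 2.0*0.5 = 4.0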

Level 2 - On this level, the testing application is used by a single 
tester who carries out a BITV design support test. Often, sites that 
want to be approved by us undergo this pre-test to ensure that the site 
will reach a score of 90 points (out of 100) or higher in the final 
conformance test. The page selection can be done by the tester or be 
suggested by the customer (they may want to test just some new feature, 
process or layout they are uncertain about). This test, too, does not 
lead to any conformance statement, of course.

Level 3 - this, then, is the final conformance test. Page selection is 
made by the tester - if legacy areas of a site are excluded, this has 
to be made clear on the site. The number of pages required in the 
sample depends on the complexity / number of templates of the site, but 
the approach is similar to what Denis has described. Additional states 
of pages (and how to call them up) are described as part of the 
(documented) page selection. The test is then carried out separately by 
two independent testers. Once both testers are finished, the testing 
tool highlights the checkpoints that were ranked or commented on 
differently, and the arbitration process involving both testers goes 
through all of these to find a consensus rating (and appropriate 
comments). The more experienced the testers, the closer they normally 
are to the final arbitrated result - so the offset in rankings is also 
a metric for the qualification level of testers.
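
To illustrate what the arbitration step and that 'offset' metric might 
look like, here is a minimal Python sketch. The numeric per-checkpoint 
rating scale and the checkpoint ids are assumptions on my part; the 
actual tool logic will of course differ:

# Rough sketch: compare two testers' per-checkpoint ratings, list the
# checkpoints that need arbitration, and measure each tester's average
# offset from the arbitrated result. A 0-3 numeric rating per
# checkpoint is assumed here, not the real BITV-Test scale.

tester_a = {"cp01": 3, "cp02": 2, "cp03": 1}
tester_b = {"cp01": 3, "cp02": 1, "cp03": 1}

# Checkpoints the tool would highlight for the arbitration session:
to_arbitrate = [cp for cp in tester_a if tester_a[cp] != tester_b[cp]]
print("Needs arbitration:", to_arbitrate)   # ['cp02']

# After arbitration there is one consensus rating per checkpoint.
arbitrated = {"cp01": 3, "cp02": 2, "cp03": 1}

def mean_offset(ratings, consensus):
    """Average absolute distance from the arbitrated result - a rough
    indicator of how close a tester was to the final outcome."""
    diffs = [abs(ratings[cp] - consensus[cp]) for cp in consensus]
    return sum(diffs) / len(diffs)

print("Tester A offset:", mean_offset(tester_a, arbitrated))  # 0.0
print("Tester B offset:", mean_offset(tester_b, arbitrated))  # ~0.33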

There is a requirement that the 90+ conformance seal links to the HTML 
version of the test report, so all judgements are documented and can be 
checked by calling up the respective pages tested.

Of course I know there will be as many ways to do this as there are 
testing tools...

Detlev

Am 22.08.2011 15:53, schrieb Shadi Abou-Zahra:
> Hi Roberto,
>
> On 22.8.2011 14:33, Roberto Scano (IWA/HWG) wrote:
>> Hi, one question: are we thinking about a single evaluation method
>> or evaluation with different targets (e.g. companies, public
>> administration, etc.)?
>
> How would an evaluation of a private sector website be different from
> an evaluation of a public sector website?
>
>
>> So, to be more clear: are we planning work that helps developers
>> self-evaluate their own work, or are we thinking about something
>> that can be useful to companies / governments that need metrics to
>> evaluate web-based products?
>
> The methodology should be usable for self-assessment as well as for
> third-party/conformity assessment.
>
> Best,
> Shadi
>


-- 
---------------------------------------------------------------
Detlev Fischer PhD
DIAS GmbH - Daten, Informationssysteme und Analysen im Sozialen
Management: Thomas Lilienthal, Michael Zapp

Phone: +49-40-43 18 75-25
Mobile: +49-157 7-170 73 84
Fax: +49-40-43 18 75-19
E-Mail: fischer@dias.de

Address: Schulterblatt 36, D-20357 Hamburg
Amtsgericht Hamburg HRB 58 167
Managing directors: Thomas Lilienthal, Michael Zapp
---------------------------------------------------------------
