Re: Evaluation scheme with three options - proposal

Hi Elle,

Most of our clients (65%) first order a simple test (a pass/fail scorecard with minimal comments where relevant). Approximately 20% of these go on to request a full audit or training.

We also offer a free “spot check”, which just highlights a couple of issues with a very general overview. This is very popular, but only 10% lead to paying work!

Richard

From: Elle 
Sent: Sunday, February 19, 2012 7:36 PM
To: Kerstin Probiesch 
Cc: EVAL TF 
Subject: Re: Evaluation scheme with three options - proposal

Kerstin: 

I like these three options. I am interested, however, in how many clients typically ask for something as abbreviated as Option 1. For those in this group, do you find that a lot of clients don't want more than the pass/fail report?



Regards,
Elle





On Sun, Feb 19, 2012 at 4:36 AM, Kerstin Probiesch <k.probiesch@googlemail.com> wrote:

  Hi all,

  in our last teleconference we discussed an evaluation scheme with three
  options based upon 100% Conformance. I appreciate these proposals and see
  them as a chance to integrate or point to the three WCAG 2.0 documents:
  the Guidelines and SCs, Understanding WCAG 2.0, and How to Meet WCAG 2.0.

  One proposal for handling the documents in an evaluation scheme, based upon
  the normative guidelines and SCs as the core:

  =====
  Option 1: WCAG 2.0 – Core Test ("light version" or whatever the wording
  will be later)

  # Guideline X (Heading)

  ## Checkpoint: SC XX (Subheading)

  Result: pass/fail

  Character: global/regional (or another wording) – if regional: a list of
  pages where the problem exists

  ## Checkpoint: SC XX (Subheading)

  Result: pass/fail

  Character: global/regional (or another wording) – if regional: a list of
  pages where the problem exists

  (...)

  =====

  Use cases for Option 1:

  - experienced developers and clients who know WCAG 2.0 and need just the
  results,
  - comparative evaluations (20 hotel websites, city websites…)
  - or, for example, just the Level A SCs and a smaller scope as a pre-test,
  to decide together with the client what the best next steps might be
  (evaluation, consulting, possibly workshops for editors)

  =====

  Option 2: WCAG 2.0 – Core incl. Understanding (name?)

  # Guideline X (Heading)

  ## Checkpoint: SC XX (Subheading)

  Result: pass/fail

  Character: global/regional (or another wording) – if regional: a list of
  pages where the problem exists

  Problem (Subheading): Description of existing problems and barriers for
  users (here, know-how from the Understanding document could be part of the
  description).

  ## Checkpoint: SC XX (Subheading)

  Result: pass/fail

  Character: global/regional (or another wording) – if regional: a list of
  pages where the problem exists

  Problem (Subheading): Description of existing problems and barriers for
  users (here, know-how from the Understanding document could be part of the
  description).

  (...)

  ======

  Use cases:

  - comparative evaluations (depending on the specific time and costs)

  - if a client just wants descriptions

  - regular tests like "evaluation of the week"

  =====

  Option 3: WCAG 2.0 – Core, Understanding, How to Meet (name?)

  # Guideline X (Heading)

  ## Checkpoint: SC XX (Subheading)

  Result: pass/fail

  Character: global/regional (or another wording) – if regional: a list of
  pages where the problem exists

  Problem (Subheading): Description/explanation of existing problems and
  barriers for users (here, know-how from the Understanding document could be
  part of the description).

  Action (Subheading): Description of techniques for meeting the SC (these
  could be techniques that are already in the Techniques document, or new
  techniques that are not in the document but with which the SC can be met).
  Here even usability aspects can play a role, for example: you can do a, b,
  c, or d – I/we propose/recommend c.

  ## Checkpoint: SC XX (Subheading)

  Result: pass/fail

  Character: global/regional (or another wording) – if regional: a list of
  pages where the problem exists

  Problem (Subheading): Description/explanation of existing problems and
  barriers for users (here, know-how from the Understanding document could be
  part of the description).

  Action (Subheading): Description of techniques for meeting the SC (these
  could be techniques that are already in the Techniques document, or new
  techniques that are not in the document but with which the SC can be met).
  Here even usability aspects can play a role, for example: you can do a, b,
  c, or d – I/we propose/recommend c.

  (...)

  ======

  Use cases:

  - test incl. consulting

  - for clients who are not very familiar with accessibility and WCAG 2.0

  ============

  For a seal/badge or any formal confirmation, Option 1 is the minimum.

  A report might (or should?) also have introductory parts like:

  - A short description of Option 1, 2, or 3

  - Something like a disclaimer ("results might not be complete; therefore it
  is important to go through the page, view all similar elements, and solve
  the corresponding problems")

  - Glossary (for specific terms used in our methodology, like
  regional/global, if we decide to use them)

  - Documentation of the OS, browsers, and versions used, and possibly the
  assistive technologies used, incl. versions

  - Tested Conformance Level (A, AA, AAA)

  - Results

  - Summary, probably written as an overall impression. We discussed the
  'motivation factor' on this list; I think the aim of an evaluation is not
  to motivate. Nevertheless, writing a nice overall impression in a report
  may have this function. OK, except when there is nothing nice to say.

  This scheme could probably also be used for processes, PDF, Flash, and so
  on, and I think it would be flexible enough (time, costs, ...) and at the
  same time valid against the Conformance Requirements, because the core
  (the evaluation itself) is the same in every option.

  What is important, as I see it, is that the evaluator keeps the three
  different aspects in mind and in the report, and I believe they shouldn't
  be mixed: evaluation (core, testing the SCs), explanation (description of
  the problem/violation, Understanding), and consulting (How to Meet,
  usability, ...).


  The evaluator could document the "progress toward meeting success criteria
  from all levels beyond the achieved level of conformance": if, for example,
  the evaluation is for Level A with Option 3, the AA SCs could also be
  checked (pass/fail), with or without further description, depending on the
  contract.

  Advantage: every evaluator or testing organization uses the methodology and
  a standardized 'template' for the core and the evaluation itself. The
  descriptions of existing barriers (the explanatory part/Understanding in
  Options 2 and 3) and the consulting part (How to Meet, in Option 3) would
  be the specific added value for the clients/the evaluator/the testing
  organization.
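
  Just to illustrate how the shared core and the optional parts relate (the
  field names below are only placeholders of mine, not a proposal for
  wording), each checkpoint could be thought of as one structured record:

    # Illustrative sketch only; field names are placeholders, not part of
    # the methodology or of WCAG 2.0.
    from dataclasses import dataclass, field
    from typing import List, Optional

    @dataclass
    class CheckpointEntry:
        sc: str                         # e.g. "1.1.1" (Success Criterion)
        result: str                     # "pass" or "fail"
        character: str                  # "global" or "regional"
        pages: List[str] = field(default_factory=list)  # only if regional
        problem: Optional[str] = None   # Options 2 and 3: barrier description
        action: Optional[str] = None    # Option 3 only: how to meet / recommendation

  Options 1, 2, and 3 would then differ only in which of the optional fields
  are filled in, while the core (sc, result, character, pages) stays the same
  in every option.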


  Thoughts?

  Best

  --Kerstin


  -------------------------------------
  Kerstin Probiesch - Freelance Consultant
  Accessibility, Social Media, Web Competence
  Kantstraße 10/19 | 35039 Marburg
  Tel.: 06421 167002
  E-Mail: mail@barrierefreie-informationskultur.de
  Web: http://www.barrierefreie-informationskultur.de

  XING: http://www.xing.com/profile/Kerstin_Probiesch
  Twitter: http://twitter.com/kprobiesch
  ------------------------------------







-- 

If you want to build a ship, don't drum up the people to gather wood, divide the work, and give orders. Instead, teach them to yearn for the vast and endless sea.    
- Antoine De Saint-Exupéry, The Little Prince

Received on Sunday, 19 February 2012 20:57:03 UTC