Re: Not Applicable (was Re: Evaluation scheme with three options - proposal)

Thanks for the explanation, Shadi. I imagine it took some discussion to 
reach that consensus!  :^)

Mike

On 2/20/2012 2:30 PM, Shadi Abou-Zahra wrote:
> Hi Mike,
>
> Good question. We had a long discussion about that and also asked the 
> WCAG Working Group about their position on this.
>
> According to WCAG WG, the term "Not Applicable" is not defined and is 
> ambiguous. Accessibility requirements are deemed met when the content 
> does not require specific accessibility features. For example, the 
> requirement for captioning is deemed met if there is no video content.
>
> I will try to dig out the corresponding pointers, but I recall that this 
> was something that was less clearly documented in the WCAG documents. 
> We will probably need to clarify this point somewhere in section 5 of 
> the Methodology, and possibly ask the WCAG WG to also clarify some of 
> their materials (so that we can refer to it from our explanation).
>
> Best,
>   Shadi
>
>
> On 20.2.2012 19:26, Michael S Elledge wrote:
>> Hi Shadi--
>>
>> I noticed in the BAD example that success criteria for which there was
>> no related web content received a "Pass." I'm curious why that approach
>> was chosen rather than identifying such instances as "Not Applicable" or
>> "NA." Wouldn't using the term "NA" be both more informative and
>> accurate?
>>
>> Mike
>>
>> On 2/20/2012 10:43 AM, Shadi Abou-Zahra wrote:
>>> Small addition:
>>>
>>> On 20.2.2012 16:28, Shadi Abou-Zahra wrote:
>>>> Hi Kerstin, All,
>>>>
>>>> I'm not too sure what the difference between options #1 and #2 would be
>>>> in practice, as I hope that evaluators will simply link to Techniques
>>>> rather than attempt to explain the issues themselves.
>>>>
>>>>
>>>> Here is an example of what a report of option #1 could look like:
>>>> - <http://www.w3.org/WAI/demos/bad/before/reports/home.html>
>>>
>>> Here is a positive example too: ;)
>>> - <http://www.w3.org/WAI/demos/bad/after/reports/home.html>
>>>
>>>
>>> Regards,
>>> Shadi
>>>
>>>
>>>> Note: this is a report for a single page but it could still be a basis
>>>> for reports of option #1 for entire websites; it just has a pass/fail
>>>> for each Success Criterion and some Techniques to justify these 
>>>> claims.
>>>>
>>>>
>>>> For option #2 we could introduce a scoring function in addition to the
>>>> pass/fail result. This would require the evaluators to fully evaluate
>>>> every page in the selected sample and count the frequencies of 
>>>> errors to
>>>> calculate a score. It could help compare websites and motivate the
>>>> developers (at least those who are close to full compliance).
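>>>>
>>>> Purely as an illustration of such a scoring function (the formula --
>>>> the share of passed checks across the sampled pages -- and the numbers
>>>> below are assumptions for discussion, not anything agreed or defined
>>>> by WCAG), a sketch in Python:
>>>>
>>>> # Illustrative sketch only: one possible scoring function for option #2.
>>>> # The formula and the sample counts are assumptions, not part of the
>>>> # Methodology.
>>>> def score(pages):
>>>>     """pages: list of dicts with per-page 'passed' and 'failed' counts."""
>>>>     passed = sum(p["passed"] for p in pages)
>>>>     total = sum(p["passed"] + p["failed"] for p in pages)
>>>>     return 100.0 if total == 0 else 100.0 * passed / total
>>>>
>>>> sample = [
>>>>     {"page": "home.html", "passed": 34, "failed": 4},
>>>>     {"page": "news.html", "passed": 30, "failed": 8},
>>>> ]
>>>> print(round(score(sample), 1))  # 84.2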
>>>>
>>>>
>>>> Finally, option #3 would be more in-depth reports with examples of the
>>>> errors and explanations of ways to repair the errors. These are, as
>>>> Kerstin says, developed by consultants (as opposed to pure evaluators)
>>>> for developers who are new to accessibility.
>>>>
>>>> We attempted to provide such an example report in the initial version of
>>>> the Before and After Demo (BAD), but it is really a lot of work:
>>>> - <http://www.w3.org/WAI/EO/2005/Demo/report/>
>>>>
>>>>
>>>> Regards,
>>>> Shadi
>>>>
>>>>
>>>> On 19.2.2012 20:36, Elle wrote:
>>>>> Kerstin:
>>>>>
>>>>> I like these three options. I am interested, however, in how many clients
>>>>> typically ask for something as abbreviated as Option 1. For those in this
>>>>> group, do you experience situations with a lot of clients who don't want
>>>>> more than the pass/fail report?
>>>>>
>>>>>
>>>>>
>>>>> Regards,
>>>>> Elle
>>>>>
>>>>>
>>>>>
>>>>>
>>>>> On Sun, Feb 19, 2012 at 4:36 AM, Kerstin Probiesch<
>>>>> k.probiesch@googlemail.com> wrote:
>>>>>
>>>>>> Hi all,
>>>>>>
>>>>>> In our last teleconference we discussed an evaluation scheme with three
>>>>>> options based upon 100% Conformance. I appreciate these proposals and see
>>>>>> them as a chance to integrate or point to the three documents of WCAG 2:
>>>>>> Guidelines and SCs, Understanding, and How to Meet.
>>>>>>
>>>>>> One proposal for handling the documents in an evaluation scheme, based
>>>>>> upon the normative guidelines and SCs as the core:
>>>>>>
>>>>>> =====
>>>>>> Option 1: WCAG 2.0 – Core Test ("light version" or whatever the
>>>>>> wording will be later)
>>>>>>
>>>>>> # Guideline X (Heading)
>>>>>>
>>>>>> ## Checkpoint: SC XX (Subheading)
>>>>>>
>>>>>> Result: pass/fail
>>>>>>
>>>>>> Character: global/regional (or another wording) – if regional: a
>>>>>> list of pages where the problem exists
>>>>>>
>>>>>> ## Checkpoint: SC XX (Subheading)
>>>>>>
>>>>>> Result: pass/fail
>>>>>>
>>>>>> Character: global/regional (or another wording) – if regional: a
>>>>>> list of pages where the problem exists
>>>>>>
>>>>>> (...)
>>>>>>
>>>>>> =====
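>>>>>>
>>>>>> Just to make the structure concrete, one checkpoint entry of Option 1
>>>>>> could be captured roughly like this (an illustrative sketch only; the
>>>>>> field names and example values are placeholders, not proposed
>>>>>> terminology):
>>>>>>
>>>>>> # Sketch of a single Option 1 checkpoint entry (placeholder names/values).
>>>>>> option1_entry = {
>>>>>>     "guideline": "1.1 Text Alternatives",           # Guideline X (Heading)
>>>>>>     "success_criterion": "1.1.1 Non-text Content",  # Checkpoint: SC XX
>>>>>>     "result": "fail",                               # pass/fail
>>>>>>     "character": "regional",                        # global/regional
>>>>>>     "pages": ["home.html", "news.html"],            # only if regional
>>>>>> }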
>>>>>>
>>>>>> Use cases for Option 1:
>>>>>>
>>>>>> - experienced developers and clients who know WCAG2 and need just 
>>>>>> the
>>>>>> results,
>>>>>> - comparative evaluations (20 hotel websites, city websites…)
>>>>>> - or for example just with the SCs of Level A and a smaller scope as a
>>>>>> pre-test to decide together with the client what the best next steps
>>>>>> might be (evaluation, consulting, probably workshops for editors)
>>>>>>
>>>>>> =====
>>>>>>
>>>>>> Option 2: WCAG 2.0 – Core incl. Understanding (name?)
>>>>>>
>>>>>> # Guideline X (Heading)
>>>>>>
>>>>>> ## Checkpoint: SC XX (Subheading)
>>>>>>
>>>>>> Result: pass/fail
>>>>>>
>>>>>> Character: global/regional (or another wording) – if regional: a
>>>>>> list of
>>>>>> pages where the problem exists
>>>>>>
>>>>>> Problem (Subheading): Description of existing problems and barriers for
>>>>>> users (here, know-how from the Understanding document could be part of
>>>>>> the description).
>>>>>>
>>>>>> ## Checkpoint: SC XX (Subheading)
>>>>>>
>>>>>> Result: pass/fail
>>>>>>
>>>>>> Character: global/regional (or another wording) – if regional: a
>>>>>> list of
>>>>>> pages where the problem exists
>>>>>>
>>>>>> Problem (Subheading): Description of existing problems and barriers for
>>>>>> users (here, know-how from the Understanding document could be part of
>>>>>> the description).
>>>>>>
>>>>>> (...)
>>>>>>
>>>>>> ======
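>>>>>>
>>>>>> As a sketch of the delta to Option 1 (again only an illustration; the
>>>>>> field name and the example text are assumptions), an Option 2 entry
>>>>>> would add the descriptive part:
>>>>>>
>>>>>> # Option 2 = Option 1 entry plus a "problem" description (Understanding).
>>>>>> option2_entry = dict(option1_entry)  # option1_entry as sketched above
>>>>>> option2_entry["problem"] = (
>>>>>>     "Navigation images have no text alternatives, so screen reader"
>>>>>>     " users cannot identify the link targets."
>>>>>> )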
>>>>>>
>>>>>> Use cases:
>>>>>>
>>>>>> - comparative evaluations (depending on the specific time and costs)
>>>>>>
>>>>>> - if a client just wants descriptions
>>>>>>
>>>>>> - regular tests like "evaluation of the week"
>>>>>>
>>>>>> =====
>>>>>>
>>>>>> Option 3: WCAG 2.0 – Core, Understanding, How to Meet (name?)
>>>>>>
>>>>>> # Guideline X (Heading)
>>>>>>
>>>>>> ## Checkpoint: SC XX (Subheading)
>>>>>>
>>>>>> Result: pass/fail
>>>>>>
>>>>>> Character: global/regional (or another wording) – if regional: a
>>>>>> list of
>>>>>> pages where the problem exists
>>>>>>
>>>>>> Problem (Subheading): Description/explanation of existing problems and
>>>>>> barriers for users (here, know-how from the Understanding document could
>>>>>> be part of the description).
>>>>>>
>>>>>> Action (Subheading): Description of techniques for meeting the SC (these
>>>>>> could be techniques which are already in the Techniques document, or new
>>>>>> techniques which are not in the document but with which the SC can be
>>>>>> met). Here even usability aspects can play a role, like: you can do a,
>>>>>> b, c or d – I/we propose/recommend c.
>>>>>>
>>>>>> ## Checkpoint: SC XX (Subheading)
>>>>>>
>>>>>> Result: pass/fail
>>>>>>
>>>>>> Character: global/regional (or another wording) – if regional: a
>>>>>> list of
>>>>>> pages where the problem exists
>>>>>>
>>>>>> Problem (Subheading): Description/explanation of existing problems and
>>>>>> barriers for users (here, know-how from the Understanding document could
>>>>>> be part of the description).
>>>>>>
>>>>>> Action (Subheading): Description of techniques for meeting the SC (these
>>>>>> could be techniques which are already in the Techniques document, or new
>>>>>> techniques which are not in the document but with which the SC can be
>>>>>> met). Here even usability aspects can play a role, like: you can do a,
>>>>>> b, c or d – I/we propose/recommend c.
>>>>>>
>>>>>> (...)
>>>>>>
>>>>>> ======
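>>>>>>
>>>>>> And the Option 3 entry would add the consulting ("Action") part on top
>>>>>> (again only a sketch; the example text and the technique reference are
>>>>>> illustrative):
>>>>>>
>>>>>> # Option 3 = Option 2 entry plus an "action" / how-to-meet part.
>>>>>> option3_entry = dict(option2_entry)
>>>>>> option3_entry["action"] = (
>>>>>>     "Add alt attributes with short functional texts"
>>>>>>     " (e.g. Technique H37)."
>>>>>> )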
>>>>>>
>>>>>> Use cases:
>>>>>>
>>>>>> - test incl. consulting
>>>>>>
>>>>>> - for clients who are not very familiar with accessibility and WCAG2
>>>>>>
>>>>>> ============
>>>>>>
>>>>>> For a seal/badge or any formal confirmation, Option 1 is the minimum.
>>>>>>
>>>>>> A report might (or should?) also have intro parts like:
>>>>>>
>>>>>> - Short description of the Option 1, 2 or 3
>>>>>>
>>>>>> - Something like a disclaimer ("results might not be complete, therefore
>>>>>> it is important to go through the page, view all similar elements and
>>>>>> solve the corresponding problems")
>>>>>>
>>>>>> - Glossary (for specific terms we used in our methodology – like
>>>>>> regional/global – if we decide to use them)
>>>>>>
>>>>>> - Documentation of the OS, browsers and versions used, and probably the
>>>>>> assistive technologies used, incl. versions
>>>>>>
>>>>>> - Tested Conformance Level (A, AA, AAA)
>>>>>>
>>>>>> - Results
>>>>>>
>>>>>> - Summary, probably written as an overall impression - we discussed the
>>>>>> 'motivation factor' on this list. I think the aim of an evaluation is not
>>>>>> to motivate. Nevertheless, writing a nice overall impression in a report
>>>>>> may have this function. OK, except when there is nothing nice to say.
>>>>>>
>>>>>> This scheme could probably also be used for processes, PDF, Flash and so
>>>>>> on, and I think it would be flexible enough (time, costs, ...) and at the
>>>>>> same time valid against the Conformance Requirements, because the core
>>>>>> (the evaluation itself) is the same in every option.
>>>>>>
>>>>>> What is important, as I see it, is that the evaluator keeps the three
>>>>>> different aspects in mind and in the report, and I believe they shouldn't
>>>>>> be mixed: evaluation (Core, testing SCs), explanation (description of the
>>>>>> problem/violation, Understanding) and consulting (How to Meet,
>>>>>> usability, ...).
>>>>>>
>>>>>>
>>>>>> The evaluator could document the "progress toward meeting success
>>>>>> criteria from all levels beyond the achieved level of conformance": if,
>>>>>> for example, the evaluation is for Level A with Option 3, the SCs of
>>>>>> Level AA could also be checked (pass/fail), with or without further
>>>>>> description, depending on the contract.
>>>>>>
>>>>>> Advantage: every evaluator or testing organization uses the methodology
>>>>>> and a standardized 'template' for the core and the evaluation itself. The
>>>>>> descriptions of existing barriers (explanatory part/Understanding in
>>>>>> Options 2 and 3) and the consulting part (How to Meet, in Option 3) would
>>>>>> be the specific added value for the clients/the evaluator/the testing
>>>>>> organization.
>>>>>>
>>>>>>
>>>>>> Thoughts?
>>>>>>
>>>>>> Best
>>>>>>
>>>>>> --Kerstin
>>>>>>
>>>>>>
>>>>>> -------------------------------------
>>>>>> Kerstin Probiesch - Freie Beraterin
>>>>>> Barrierefreiheit, Social Media, Webkompetenz
>>>>>> Kantstraße 10/19 | 35039 Marburg
>>>>>> Tel.: 06421 167002
>>>>>> E-Mail: mail@barrierefreie-informationskultur.de
>>>>>> Web: http://www.barrierefreie-informationskultur.de
>>>>>>
>>>>>> XING: http://www.xing.com/profile/Kerstin_Probiesch
>>>>>> Twitter: http://twitter.com/kprobiesch
>>>>>> ------------------------------------
>>>>>>
>>>>>>
>>>>>>
>>>>>
>>>>>
>>>>
>>>
>>
>

Received on Tuesday, 21 February 2012 17:30:01 UTC