W3C home > Mailing lists > Public > public-wai-evaltf@w3.org > February 2012

Re: Not Applicable (was Re: Evaluation scheme with three options - proposal)

From: Shadi Abou-Zahra <shadi@w3.org>
Date: Tue, 28 Feb 2012 07:57:22 +0100
Message-ID: <4F4C7AD2.4040707@w3.org>
To: Loïc Martínez Normand <loic@fi.upm.es>
CC: "public-wai-evaltf@w3.org" <public-wai-evaltf@w3.org>
Good point, Loic.

I agree that if we are going to introduce the term Not Applicable then 
we will need to define it very carefully and explore the impact, as it 
is a kind of interpretation of WCAG.

Best,
   Shadi


On 27.2.2012 12:54, Loïc Martínez Normand wrote:
> Dear all,
>
> I would like to say a couple of things. First, the Sidar Foundation has
> long used "not applicable" as one of the individual results when
> evaluating each SC. As many have said, it seems more logical as a value for
> customers, and (I think) it provides more information than a "pass" with no
> applicable content.
>
> Having said that, I also believe that in WCAG 2.0 this depends on the
> individual SC. Some of them have been written so that there is never the
> possibility of an "N.A." result, while others are written in a
> conditional format, so it is almost "natural" to report "N.A." when there
> is no applicable content.
>
> An example of the first group of SCs (the ones that will always have just
> pass/fail) is 2.3.1 (three flashes or below threshold). Each web page will
> either pass or fail. There is no way we can say "not applicable".
>
> An example of the second group of SCs (the ones with conditional
> statements) is 2.2.1 (timing adjustable). My understanding of the words of
> 2.2.1 is that if there is no time limit then the success criterion does not
> apply.
>
> In summary, maybe the methodology could list which SCs have pass/fail
> values and which others may have pass/fail/NA.
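Loïc's suggestion of listing which SCs allow an N/A result could be captured as a simple lookup table in an evaluation tool. This is a hypothetical sketch; the classification shown is illustrative only, and the methodology itself would have to define the actual table.

```python
# Hypothetical sketch of the suggested SC listing: each Success Criterion
# maps to its permitted result values. The classification is illustrative.

ALLOWED_RESULTS = {
    "2.3.1": {"pass", "fail"},         # Three Flashes: always applicable
    "2.2.1": {"pass", "fail", "n/a"},  # Timing Adjustable: conditional
    "1.2.2": {"pass", "fail", "n/a"},  # Captions: conditional on video content
}

def validate_result(sc: str, result: str) -> bool:
    """Check whether `result` is a permitted value for the given SC.

    SCs not in the table default to plain pass/fail.
    """
    return result in ALLOWED_RESULTS.get(sc, {"pass", "fail"})

print(validate_result("2.3.1", "n/a"))  # False: 2.3.1 is always pass/fail
print(validate_result("2.2.1", "n/a"))  # True: e.g. no time limit present
```

Such a table would make the pass/fail-vs-N/A decision explicit per SC instead of leaving it to each evaluator's interpretation.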
>
> Best regards,
> Loïc
>
> On Fri, Feb 24, 2012 at 3:13 AM, Elle<nethermind@gmail.com>  wrote:
>
>> From a client perspective, I would rather see "Pass/Fail/Not Applicable"
>> for the exact reasons that Vivienne describes.  We do indeed use snapshots
>> drawn in easily digestible pictures for leadership. It's important to
>> understand the true measure of the capability of your development teams to
>> meet accessibility requirements, and having false positives (so to speak)
>> inflates that score and prevents us from seeing the severity of the
>> non-conformance/risk.
>>
>> Respectfully,
>> Elle
>>
>>
>>
>> On Thu, Feb 23, 2012 at 10:00 AM, Boland Jr, Frederick E.<
>> frederick.boland@nist.gov>  wrote:
>>
>>> Fyi - The latest W3C WAI ATAG2.0 draft success criteria satisfaction
>>> options for conformance are: yes, no, not applicable:
>>> http://www.w3.org/WAI/AU/2012/ED-ATAG20-20120210/#conf-req
>>>
>>> Thanks Tim Boland NIST
>>>
>>> -----Original Message-----
>>> From: Detlev Fischer [mailto:fischer@dias.de]
>>> Sent: Thursday, February 23, 2012 9:49 AM
>>> To: public-wai-evaltf@w3.org
>>> Subject: Re: Not Applicable (was Re: Evaluation scheme with three options
>>> - proposal)
>>>
>>> I agree that the use of N.A. has practical advantages, and that not using
>>> it will be confusing to many people not into testing technicalities.
>>> The trade-off is whether we want to upset or run counter to a WCAG WG
>>> decision that was seemingly made after lengthy deliberation some time
>>> ago.
>>>
>>> Regarding techniques vs SCs as checkpoints, I do think that checkpoints
>>> should be more detailed than, say, SC 1.1.1 or SC 1.3.1, which combine an
>>> awful lot of different requirements.
>>>
>>> When I said in the last telecon that techniques themselves should not be
>>> used as checkpoints, I meant that the level of a technique is (often) too
>>> fine-grained for a checkpoint, especially since several techniques may be
>>> applicable and used to meet a specific SC, and in practical terms the
>>> check may cover several of them at the same time.
>>>
>>> For example, looking at alternative text, you would look at all images
>>> on a page and determine whether the alt text should give the purpose or
>>> destination of linked images, the content of unlinked images, or be
>>> empty for decorative images. For all these things, different techniques
>>> exist, but in terms of checkpoint procedure, you just run through all
>>> images and check for each one that the appropriate technique has been
>>> used.
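Detlev's checkpoint procedure (walk through all images and verify that the appropriate alt-text treatment was used) could be sketched roughly as below. The `Image` model and the decision rules are hypothetical simplifications, not part of any WCAG document.

```python
# Rough sketch of a single alt-text checkpoint that covers several WCAG
# techniques at once. The Image model and rules are hypothetical.

from dataclasses import dataclass
from typing import List, Optional

@dataclass
class Image:
    alt: Optional[str]    # None means no alt attribute at all
    is_linked: bool
    is_decorative: bool

def expected_treatment(img: Image) -> str:
    """Which alt-text treatment this image calls for."""
    if img.is_decorative:
        return "empty alt"
    if img.is_linked:
        return "link purpose/destination in alt"
    return "image content in alt"

def check_images(images: List[Image]) -> List[str]:
    """Run through all images and report those using the wrong treatment."""
    findings = []
    for i, img in enumerate(images):
        if img.is_decorative and img.alt not in ("", None):
            findings.append(f"image {i}: decorative image should have empty alt")
        elif not img.is_decorative and not img.alt:
            findings.append(f"image {i}: missing alt ({expected_treatment(img)})")
    return findings
```

A single pass over the page thus exercises several techniques (empty alt for decoration, link purpose, image content) within one checkpoint, which is Detlev's point about granularity.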
>>>
>>> Regards,
>>> Detlev
>>>
>>>
>>> On 23.02.2012 12:02, Kerstin Probiesch wrote:
>>>> +1 for what Richard says :-)
>>>>
>>>>> -----Original Message-----
>>>>> From: RichardWarren [mailto:richard.warren@userite.com]
>>>>> Sent: Thursday, 23 February 2012 11:53
>>>>> To: Vivienne CONWAY; Eval TF
>>>>> Subject: Re: Not Applicable (was Re: Evaluation scheme with three
>>>>> options - proposal)
>>>>>
>>>>> Hi Vivienne and all.
>>>>>
>>>>> When I use the "pass" term for things that do not exist, I get queries
>>>>> from my clients who think that I have not done the job properly.
>>>>>
>>>>> To save us all the hassle of explaining the complexity of W3C logic, I
>>>>> therefore use N/A for SCs and Guideline 1.2 on my score sheet. At the top
>>>>> of my score sheet I explain the meanings of Pass, Fail and N/A, including
>>>>> that we give a pass score value to N/A because the designer has avoided
>>>>> using the relevant technologies that can cause particular problems. This
>>>>> means that in the executive summary I can count N/As as Pass and avoid the
>>>>> hassle. (e.g. "The site passes all 12 guidelines")
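Richard's score-sheet convention (N/A counted as Pass in the executive summary, while the detailed sheet keeps the distinction) amounts to something like the sketch below; the result values and data are hypothetical.

```python
# Sketch of an executive summary that counts N/A results as passes, while
# the detailed score sheet keeps the distinction. Data here is made up.

def executive_summary(results: dict) -> str:
    passed = sum(1 for r in results.values() if r in ("pass", "n/a"))
    return f"The site passes {passed} of {len(results)} checkpoints"

sheet = {"1.1.1": "pass", "1.2.1": "n/a", "1.4.3": "pass"}
print(executive_summary(sheet))  # The site passes 3 of 3 checkpoints
```

The detailed sheet still records "n/a", so the information Vivienne asks for is not lost; only the headline number folds it into Pass.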
>>>>>
>>>>> Richard
>>>>>
>>>>> -----Original Message-----
>>>>> From: Vivienne CONWAY
>>>>> Sent: Thursday, February 23, 2012 9:09 AM
>>>>> To: Shadi Abou-Zahra
>>>>> Cc: Eval TF
>>>>> Subject: RE: Not Applicable (was Re: Evaluation scheme with three
>>>>> options -
>>>>> proposal)
>>>>>
>>>>> Hi Shadi & all
>>>>>
>>>>> That's true, however where it comes unstuck is in the reporting. CEOs
>>>>> like to see lovely graphs showing how many of the criteria have been met.
>>>>> If the data is recorded as 'passed' even though the content doesn't exist
>>>>> (the audio file), then the graph and any stats look better than they
>>>>> really are. If it is n/a or something similar, they can't confuse this
>>>>> with something that actually meets a requirement.
>>>>>
>>>>>
>>>>> Regards
>>>>>
>>>>> Vivienne L. Conway, B.IT(Hons), MACS CT
>>>>> PhD Candidate & Sessional Lecturer, Edith Cowan University, Perth, W.A.
>>>>> Director, Web Key IT Pty Ltd.
>>>>> v.conway@ecu.edu.au
>>>>> v.conway@webkeyit.com
>>>>> Mob: 0415 383 673
>>>>>
>>>>> This email is confidential and intended only for the use of the
>>>>> individual
>>>>> or entity named above. If you are not the intended recipient, you are
>>>>> notified that any dissemination, distribution or copying of this email
>>>>> is
>>>>> strictly prohibited. If you have received this email in error, please
>>>>> notify
>>>>> me immediately by return email or telephone and destroy the original
>>>>> message.
>>>>> ________________________________________
>>>>> From: Shadi Abou-Zahra [shadi@w3.org]
>>>>> Sent: Thursday, 23 February 2012 6:02 PM
>>>>> To: Vivienne CONWAY
>>>>> Cc: Eval TF
>>>>> Subject: Re: Not Applicable (was Re: Evaluation scheme with three
>>>>> options -
>>>>> proposal)
>>>>>
>>>>> Hi Vivienne,
>>>>>
>>>>> Isn't it the same whether you called it "passed" or "not applicable" when
>>>>> the content is later added? In both cases your report is already out of
>>>>> date and the content needs to be reassessed.
>>>>>
>>>>> Note that even in this most basic report it is pretty clear when the
>>>>> content passed because there was no corresponding content:
>>>>>     -<http://www.w3.org/WAI/demos/bad/before/reports/home.html>
>>>>>
>>>>> Best,
>>>>>      Shadi
>>>>>
>>>>>
>>>>> On 23.2.2012 09:47, Vivienne CONWAY wrote:
>>>>>> HI Alistair
>>>>>> I agree with you on this one for sure.
>>>>>>
>>>>>> If I can't find an item (say an audio file), that does not necessarily
>>>>>> mean that the website should pass on that criterion. It may be added
>>>>>> later, or I may just not have located it when looking for suitable pages
>>>>>> to assess. I'm more comfortable with 'not applicable' or 'not tested' or
>>>>>> 'not located on reviewed pages' or something similar.
>>>>>>
>>>>>>
>>>>>> Regards
>>>>>>
>>>>>> Vivienne L. Conway, B.IT(Hons), MACS CT
>>>>>> PhD Candidate & Sessional Lecturer, Edith Cowan University, Perth, W.A.
>>>>>> Director, Web Key IT Pty Ltd.
>>>>>> v.conway@ecu.edu.au<mailto:v.conway@ecu.edu.au>
>>>>>> v.conway@webkeyit.com<mailto:v.conway@webkeyit.com>
>>>>>> Mob: 0415 383 673
>>>>>>
>>>>>>
>>>>>> ________________________________
>>>>>> From: Alistair Garrison [alistair.j.garrison@gmail.com]
>>>>>> Sent: Thursday, 23 February 2012 5:43 PM
>>>>>> To: Eval TF
>>>>>> Subject: Re: Not Applicable (was Re: Evaluation scheme with three
>>>>>> options - proposal)
>>>>>>
>>>>>> Hi All,
>>>>>>
>>>>>> I would feel a little uncomfortable declaring something to be passed
>>>>> -
>>>>>> simply because I could not find any applicable content.
>>>>>>
>>>>>> I know that when showing conformity with ISO documentation one must give
>>>>>> "reference to the appropriate Standard (title, number, and year of
>>>>>> issue); when the certification applies to only a portion of a standard,
>>>>>> the applicable portion(s) should be clearly identified" (ISO Guide 23).
>>>>>>
>>>>>> In our situation, this would mean that we simply list all the things
>>>>>> (most probably Success Criteria) we have found to be applicable in the
>>>>>> Conformance Claim, and then go on to state which of them has been passed
>>>>>> or failed in the report.
>>>>>>
>>>>>> Hope this helps.
>>>>>>
>>>>>> All the best
>>>>>>
>>>>>> Alistair
>>>>>>
>>>>>> On 22 Feb 2012, at 12:37, Emmanuelle Gutiérrez y Restrepo wrote:
>>>>>>
>>>>>> My +1 too :-)
>>>>>>
>>>>>> I think that this is very important.
>>>>>>
>>>>>> Regards,
>>>>>> Emmanuelle
>>>>>>
>>>>>> 2012/2/22 Velleman,
>>>>>> Eric<evelleman@bartimeus.nl<mailto:evelleman@bartimeus.nl>>
>>>>>> Hi Vivienne,
>>>>>>
>>>>>> I think I put it on the agenda, so let's talk about it.
>>>>>> Kindest regards,
>>>>>>
>>>>>> Eric
>>>>>>
>>>>>>
>>>>>>
>>>>>> ________________________________________
>>>>>> From: Vivienne CONWAY
>>>>>> [v.conway@ecu.edu.au<mailto:v.conway@ecu.edu.au>]
>>>>>> Sent: Wednesday, 22 February 2012 3:13
>>>>>> To: Michael S Elledge; Shadi Abou-Zahra
>>>>>> CC: public-wai-evaltf@w3.org<mailto:public-wai-evaltf@w3.org>
>>>>>> Subject: RE: Not Applicable (was Re: Evaluation scheme with three
>>>>>> options - proposal)
>>>>>>
>>>>>> Hi all
>>>>>>
>>>>>> I just ran across this discussion, which is something that I think we
>>>>>> should put in the EVTF methodology. I know that I've been using n/a
>>>>>> when it seemed the item was not present in the website, e.g. videos.
>>>>>> However, if this is the W3C consensus, I'll need to change my reporting.
>>>>>> Can we talk about this in our teleconference this week?
>>>>>>
>>>>>>
>>>>>> Regards
>>>>>>
>>>>>> Vivienne L. Conway, B.IT(Hons), MACS CT
>>>>>> PhD Candidate & Sessional Lecturer, Edith Cowan University, Perth, W.A.
>>>>>> Director, Web Key IT Pty Ltd.
>>>>>> v.conway@ecu.edu.au<mailto:v.conway@ecu.edu.au>
>>>>>> v.conway@webkeyit.com<mailto:v.conway@webkeyit.com>
>>>>>> Mob: 0415 383 673
>>>>>>
>>>>>> ________________________________________
>>>>>> From: Michael S Elledge [elledge@msu.edu<mailto:elledge@msu.edu>]
>>>>>> Sent: Wednesday, 22 February 2012 2:29 AM
>>>>>> To: Shadi Abou-Zahra
>>>>>> Cc: public-wai-evaltf@w3.org<mailto:public-wai-evaltf@w3.org>
>>>>>> Subject: Re: Not Applicable (was Re: Evaluation scheme with three
>>>>>> options - proposal)
>>>>>>
>>>>>> Thanks for the explanation, Shadi. I imagine it took some discussion
>>>>> to
>>>>>> reach that consensus!  :^)
>>>>>>
>>>>>> Mike
>>>>>>
>>>>>> On 2/20/2012 2:30 PM, Shadi Abou-Zahra wrote:
>>>>>>> Hi Mike,
>>>>>>>
>>>>>>> Good question. We had a long discussion about that and also asked the
>>>>>>> WCAG Working Group for their position on this.
>>>>>>>
>>>>>>> According to WCAG WG, the term "Not Applicable" is not defined and
>>>>> is
>>>>>>> ambiguous. Accessibility requirements are deemed met when the
>>>>> content
>>>>>>> does not require specific accessibility features. For example, the
>>>>>>> requirement for captioning is deemed met if there is no video
>>>>> content.
>>>>>>>
>>>>>>> I will try to dig out the corresponding pointers, but I recall that
>>>>>>> this was something that was less clearly documented in the WCAG
>>>>>>> documents. We will probably need to clarify this point somewhere in
>>>>>>> section 5 of the Methodology, and possibly ask the WCAG WG to also
>>>>>>> clarify some of their materials (so that we can refer to it from our
>>>>>>> explanation).
>>>>>>>
>>>>>>> Best,
>>>>>>>      Shadi
>>>>>>>
>>>>>>>
>>>>>>> On 20.2.2012 19:26, Michael S Elledge wrote:
>>>>>>>> Hi Shadi--
>>>>>>>>
>>>>>>>> I noticed in the BAD example that success criteria for which there
>>>>> was
>>>>>>>> no related web content received a "Pass." I'm curious why that
>>>>> approach
>>>>>>>> was chosen rather than to identify such instances as "Not
>>>>> Applicable" or
>>>>>>>> "NA." Wouldn't using the term "NA" be both more informative and
>>>>>>>> accurate?
>>>>>>>>
>>>>>>>> Mike
>>>>>>>>
>>>>>>>> On 2/20/2012 10:43 AM, Shadi Abou-Zahra wrote:
>>>>>>>>> Small addition:
>>>>>>>>>
>>>>>>>>> On 20.2.2012 16:28, Shadi Abou-Zahra wrote:
>>>>>>>>>> Hi Kerstin, All,
>>>>>>>>>>
>>>>>>>>>> I'm not too sure what the difference between options #1 and #2
>>>>>>>>>> would be in practice, as I hope that evaluators will simply link to
>>>>>>>>>> Techniques rather than attempt to explain the issues themselves.
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>> Here is an example of what a report of option #1 could look like:
>>>>>>>>>> -<http://www.w3.org/WAI/demos/bad/before/reports/home.html>
>>>>>>>>>
>>>>>>>>> Here is a positive example too: ;)
>>>>>>>>> -<http://www.w3.org/WAI/demos/bad/after/reports/home.html>
>>>>>>>>>
>>>>>>>>>
>>>>>>>>> Regards,
>>>>>>>>> Shadi
>>>>>>>>>
>>>>>>>>>
>>>>>>>>>> Note: this is a report for a single page but it could still be a
>>>>> basis
>>>>>>>>>> for reports of option #1 for entire websites; it just has a
>>>>> pass/fail
>>>>>>>>>> for each Success Criterion and some Techniques to justify these
>>>>>>>>>> claims.
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>> For option #2 we could introduce a scoring function in addition
>>>>> to the
>>>>>>>>>> pass/fail result. This would require the evaluators to fully
>>>>> evaluate
>>>>>>>>>> every page in the selected sample and count the frequencies of
>>>>>>>>>> errors to
>>>>>>>>>> calculate a score. It could help compare websites and motivate
>>>>> the
>>>>>>>>>> developers (at least those who are close to full compliance).
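A minimal sketch of such a scoring function, assuming each sampled page reports a failure count per Success Criterion. The formula is an illustrative assumption, not something taken from WCAG or the Methodology.

```python
# Illustrative option #2 scoring: evaluate every page in the sample, count
# error frequencies, and derive a 0-100 score. The formula is an assumption.

from typing import Dict, List

def score(pages: List[Dict[str, int]]) -> float:
    """Each page maps SC -> number of failures found on that page."""
    total_checks = sum(len(page) for page in pages)
    total_errors = sum(sum(page.values()) for page in pages)
    if total_checks == 0:
        return 100.0
    return max(0.0, 100.0 * (1 - total_errors / total_checks))

sample = [{"1.1.1": 1, "1.3.1": 0},   # page 1: one alt-text failure
          {"1.1.1": 0, "1.3.1": 0}]   # page 2: clean
print(score(sample))  # 75.0
```

Because the score is a ratio rather than a binary pass/fail, two near-compliant sites can be compared, which is the motivating idea behind option #2.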
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>> Finally, option #3 would be more in-depth reports with examples
>>>>> of the
>>>>>>>>>> errors and explanations of ways to repair the errors. These are,
>>>>> as
>>>>>>>>>> Kerstin says, developed by consultants (as opposed to pure
>>>>> evaluators)
>>>>>>>>>> for developers who are new to accessibility.
>>>>>>>>>>
>>>>>>>>>> We attempted to provide such an example report in the initial
>>>>>>>>>> version of
>>>>>>>>>> the Before and After Demo (BAD) but it is really lots of work:
>>>>>>>>>> -<http://www.w3.org/WAI/EO/2005/Demo/report/>
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>> Regards,
>>>>>>>>>> Shadi
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>> On 19.2.2012 20:36, Elle wrote:
>>>>>>>>>>> Kerstin:
>>>>>>>>>>>
>>>>>>>>>>> I like these three options. I am curious, however, how many
>>>>>>>>>>> clients typically ask for something as abbreviated as Option 1.
>>>>>>>>>>> For those in this group, do you experience situations with a lot
>>>>>>>>>>> of clients who don't want more than the pass/fail report?
>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>> Regards,
>>>>>>>>>>> Elle
>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>> On Sun, Feb 19, 2012 at 4:36 AM, Kerstin Probiesch<
>>>>>>>>>>> k.probiesch@googlemail.com<mailto:k.probiesch@googlemail.com>>
>>>>>>>>>>> wrote:
>>>>>>>>>>>
>>>>>>>>>>>> Hi all,
>>>>>>>>>>>>
>>>>>>>>>>>> in our last teleconference we discussed an evaluation scheme with
>>>>>>>>>>>> three options based upon 100% conformance. I appreciate these
>>>>>>>>>>>> proposals and see them as a chance to integrate or point to the
>>>>>>>>>>>> three documents of WCAG 2: Guidelines and SCs, Understanding, and
>>>>>>>>>>>> How to Meet.
>>>>>>>>>>>>
>>>>>>>>>>>> One proposal for handling the documents in an evaluation
>>>>> scheme,
>>>>>>>>>>>> based upon
>>>>>>>>>>>> the normative guidelines and SCs as core:
>>>>>>>>>>>>
>>>>>>>>>>>> =====
>>>>>>>>>>>> Option 1: WCAG 2.0 - Core Test ("light version" or whatever the
>>>>>>>>>>>> wording
>>>>>>>>>>>> later will be)
>>>>>>>>>>>>
>>>>>>>>>>>> # Guideline X (Heading)
>>>>>>>>>>>>
>>>>>>>>>>>> ## Checkpoint: SC XX (Subheading)
>>>>>>>>>>>>
>>>>>>>>>>>> Result: pass/fail
>>>>>>>>>>>>
>>>>>>>>>>>> Character: global/regional (or another wording) - if regional: a
>>>>>>>>>>>> list of pages where the problem exists
>>>>>>>>>>>>
>>>>>>>>>>>> ## Checkpoint: SC XX (Subheading)
>>>>>>>>>>>>
>>>>>>>>>>>> Result: pass/fail
>>>>>>>>>>>>
>>>>>>>>>>>> Character: global/regional (or another wording) - if regional: a
>>>>>>>>>>>> list of pages where the problem exists
>>>>>>>>>>>>
>>>>>>>>>>>> (...)
>>>>>>>>>>>>
>>>>>>>>>>>> =====
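The Option 1 template above could be rendered from a small record per checkpoint; the field names below simply mirror the template's headings (Result, Character) and are otherwise hypothetical.

```python
# Hypothetical rendering of one Option 1 checkpoint entry. Field names
# mirror the template headings above; the example values are made up.

def format_checkpoint(cp: dict) -> str:
    lines = [
        f"## Checkpoint: SC {cp['sc']}",
        f"Result: {cp['result']}",
        f"Character: {cp['character']}",
    ]
    if cp["character"] == "regional":
        # regional problems carry a list of pages where the problem exists
        lines.append("Pages: " + ", ".join(cp["pages"]))
    return "\n".join(lines)

example = {"sc": "1.1.1", "result": "fail",
           "character": "regional", "pages": ["/contact.html"]}
print(format_checkpoint(example))
```

A standardized record like this is what would let every evaluator or testing organization emit the same core report layout, as Kerstin proposes further down.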
>>>>>>>>>>>>
>>>>>>>>>>>> Use cases for Option 1:
>>>>>>>>>>>>
>>>>>>>>>>>> - experienced developers and clients who know WCAG2 and need
>>>>> just
>>>>>>>>>>>> the
>>>>>>>>>>>> results,
>>>>>>>>>>>> - comparative evaluations (20 hotel websites, city websites...)
>>>>>>>>>>>> - or, for example, just with the SCs of Level A and a smaller
>>>>>>>>>>>> scope, as a pre-test to decide together with the client what the
>>>>>>>>>>>> best next steps might be (evaluation, consulting, probably
>>>>>>>>>>>> workshops for editors)
>>>>>>>>>>>>
>>>>>>>>>>>> =====
>>>>>>>>>>>>
>>>>>>>>>>>> Option 2: WCAG 2.0 - Core incl. understanding (name?)
>>>>>>>>>>>>
>>>>>>>>>>>> # Guideline X (Heading)
>>>>>>>>>>>>
>>>>>>>>>>>> ## Checkpoint: SC XX (Subheading)
>>>>>>>>>>>>
>>>>>>>>>>>> Result: pass/fail
>>>>>>>>>>>>
>>>>>>>>>>>> Character: global/regional (or another wording) - if regional:
>>>>> a
>>>>>>>>>>>> list of
>>>>>>>>>>>> pages where the problem exists
>>>>>>>>>>>>
>>>>>>>>>>>> Problem (Subheading): Description of existing problems and
>>>>>>>>>>>> barriers for users (here, know-how from the Understanding document
>>>>>>>>>>>> could be part of the description).
>>>>>>>>>>>>
>>>>>>>>>>>> ## Checkpoint: SC XX (Subheading)
>>>>>>>>>>>>
>>>>>>>>>>>> Result: pass/fail
>>>>>>>>>>>>
>>>>>>>>>>>> Character: global/regional (or another wording) - if regional:
>>>>> a
>>>>>>>>>>>> list of
>>>>>>>>>>>> pages where the problem exists
>>>>>>>>>>>>
>>>>>>>>>>>> Problem (Subheading): Description of existing problems and
>>>>>>>>>>>> barriers for users (here, know-how from the Understanding document
>>>>>>>>>>>> could be part of the description).
>>>>>>>>>>>>
>>>>>>>>>>>> (...)
>>>>>>>>>>>>
>>>>>>>>>>>> ======
>>>>>>>>>>>>
>>>>>>>>>>>> Use cases:
>>>>>>>>>>>>
>>>>>>>>>>>> - comparative evaluations (depending on the specific time and
>>>>> costs)
>>>>>>>>>>>>
>>>>>>>>>>>> - if a client just wants descriptions
>>>>>>>>>>>>
>>>>>>>>>>>> - regular tests like "evaluation of the week"
>>>>>>>>>>>>
>>>>>>>>>>>> =====
>>>>>>>>>>>>
>>>>>>>>>>>> Option 3: WCAG 2.0 - Core, understanding, how to meet (name?)
>>>>>>>>>>>>
>>>>>>>>>>>> # Guideline X (Heading)
>>>>>>>>>>>>
>>>>>>>>>>>> ## Checkpoint: SC XX (Subheading)
>>>>>>>>>>>>
>>>>>>>>>>>> Result: pass/fail
>>>>>>>>>>>>
>>>>>>>>>>>> Character: global/regional (or another wording) - if regional:
>>>>> a
>>>>>>>>>>>> list of
>>>>>>>>>>>> pages where the problem exists
>>>>>>>>>>>>
>>>>>>>>>>>> Problem (Subheading): description/explanation of existing
>>>>>>>>>>>> problems and barriers for users (here, know-how from the
>>>>>>>>>>>> Understanding document could be part of the description).
>>>>>>>>>>>>
>>>>>>>>>>>> Action (Subheading): Description of techniques for meeting the
>>>>> SC
>>>>>>>>>>>> (could be
>>>>>>>>>>>> techniques which are already in the techniques document or new
>>>>>>>>>>>> techniques
>>>>>>>>>>>> which are not in the document, but with which the SC can be
>>>>> met).
>>>>>>>>>>>> Here even
>>>>>>>>>>>> usability aspects can play a role, like: you can do a, b, c or
>>>>> d -
>>>>>>>>>>>> I/we
>>>>>>>>>>>> propose/recommend c.
>>>>>>>>>>>>
>>>>>>>>>>>> ## Checkpoint: SC XX (Subheading)
>>>>>>>>>>>>
>>>>>>>>>>>> Result: pass/fail
>>>>>>>>>>>>
>>>>>>>>>>>> Character: global/regional (or another wording) - if regional:
>>>>> a
>>>>>>>>>>>> list of
>>>>>>>>>>>> pages where the problem exists
>>>>>>>>>>>>
>>>>>>>>>>>> Problem (Subheading): description/explanation of existing
>>>>>>>>>>>> problems and barriers for users (here, know-how from the
>>>>>>>>>>>> Understanding document could be part of the description).
>>>>>>>>>>>>
>>>>>>>>>>>> Action (Subheading): Description of techniques for meeting the
>>>>> SC
>>>>>>>>>>>> (could be
>>>>>>>>>>>> techniques which are already in the techniques document or new
>>>>>>>>>>>> techniques
>>>>>>>>>>>> which are not in the document, but with which the SC can be
>>>>> met).
>>>>>>>>>>>> Here even
>>>>>>>>>>>> usability aspects can play a role, like: you can do a, b, c or
>>>>> d -
>>>>>>>>>>>> I/we
>>>>>>>>>>>> propose/recommend c.
>>>>>>>>>>>>
>>>>>>>>>>>> (...)
>>>>>>>>>>>>
>>>>>>>>>>>> ======
>>>>>>>>>>>>
>>>>>>>>>>>> Use cases:
>>>>>>>>>>>>
>>>>>>>>>>>> - test incl. consulting
>>>>>>>>>>>>
>>>>>>>>>>>> - for clients who are not very familiar with accessibility and
>>>>> WCAG2
>>>>>>>>>>>>
>>>>>>>>>>>> ============
>>>>>>>>>>>>
>>>>>>>>>>>> For a seal/badge or any formal confirmation Option 1 is the
>>>>> minimum.
>>>>>>>>>>>>
>>>>>>>>>>>> A report might also / should? also have intro parts like:
>>>>>>>>>>>>
>>>>>>>>>>>> - Short description of the Option 1, 2 or 3
>>>>>>>>>>>>
>>>>>>>>>>>> - Something like a disclaimer ("results might not be complete,
>>>>>>>>>>>> therefore it
>>>>>>>>>>>> is important to go through the page, view all similar elements
>>>>> and
>>>>>>>>>>>> solve
>>>>>>>>>>>> the
>>>>>>>>>>>> corresponding problems)
>>>>>>>>>>>>
>>>>>>>>>>>> - Glossary (for specific terms we used in our methodology - like
>>>>>>>>>>>> regional/global - if we decide to use them)
>>>>>>>>>>>>
>>>>>>>>>>>> - Documentation of the OS, browsers and versions used, and
>>>>>>>>>>>> probably the assistive technologies used, incl. versions
>>>>>>>>>>>>
>>>>>>>>>>>> - Tested Conformance Level (A, AA, AAA)
>>>>>>>>>>>>
>>>>>>>>>>>> - Results
>>>>>>>>>>>>
>>>>>>>>>>>> - Summary, probably written as an overall impression - we
>>>>>>>>>>>> discussed the 'motivation factor' on this list. I think the aim
>>>>>>>>>>>> of an evaluation is not to motivate. Nevertheless, writing a nice
>>>>>>>>>>>> overall impression in a report may have this function. Ok, except
>>>>>>>>>>>> when there is nothing nice to say.
>>>>>>>>>>>>
>>>>>>>>>>>> This scheme could probably also be used for processes, PDF, Flash
>>>>>>>>>>>> and so on, and I think it would be flexible enough (time, costs,
>>>>>>>>>>>> ...) and at the same time valid against the Conformance
>>>>>>>>>>>> Requirements, because the core (the evaluation itself) is the same
>>>>>>>>>>>> in every option.
>>>>>>>>>>>>
>>>>>>>>>>>> What is important, as I see it, is that the evaluator keeps the
>>>>>>>>>>>> three different aspects in mind and in the report, which I believe
>>>>>>>>>>>> shouldn't be mixed: evaluation (Core, testing SCs), explanation
>>>>>>>>>>>> (description of the problem/violation, Understanding) and
>>>>>>>>>>>> consulting (How to Meet, usability, ...)
>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>> The evaluator could document the "progress toward meeting success
>>>>>>>>>>>> criteria from all levels beyond the achieved level of conformance":
>>>>>>>>>>>> if, for example, the evaluation is for Level A with Option 3, the
>>>>>>>>>>>> SCs of AA could also be checked (pass/fail), with or without
>>>>>>>>>>>> further description, depending on the contract.
>>>>>>>>>>>>
>>>>>>>>>>>> Advantage: every evaluator or testing organization uses the
>>>>>>>>>>>> methodology and
>>>>>>>>>>>> a standardized 'template' for the core and the evaluation
>>>>> itself.
>>>>>>>>>>>> The
>>>>>>>>>>>> descriptions of existing barriers (explanatory
>>>>> part/understanding in
>>>>>>>>>>>> Option
>>>>>>>>>>>> 2 and 3) and the consulting part (How to meet, in Option 3)
>>>>> would
>>>>>>>>>>>> be the
>>>>>>>>>>>> specific added value for the clients/the evaluator/the testing
>>>>>>>>>>>> organization.
>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>> Thoughts?
>>>>>>>>>>>>
>>>>>>>>>>>> Best
>>>>>>>>>>>>
>>>>>>>>>>>> --Kerstin
>>>>>>>>>>>>
>>>>>>>>>>>>
>>>
>>>
>

-- 
Shadi Abou-Zahra - http://www.w3.org/People/shadi/
Activity Lead, W3C/WAI International Program Office
Evaluation and Repair Tools Working Group (ERT WG)
Research and Development Working Group (RDWG)
Received on Tuesday, 28 February 2012 06:58:00 GMT

This archive was generated by hypermail 2.3.1 : Friday, 8 March 2013 15:52:13 GMT