RE: Not Applicable (was Re: Evaluation scheme with three options - proposal)

Hi all,

I'm OK with "not applicable" and with "not located on reviewed pages", and
would prefer the second one. "Not tested" might be a bit confusing.

Regs

--Kerstin

> -----Original Message-----
> From: Vivienne CONWAY [mailto:v.conway@ecu.edu.au]
> Sent: Thursday, 23 February 2012 09:47
> To: Alistair Garrison; Eval TF
> Subject: RE: Not Applicable (was Re: Evaluation scheme with three
> options - proposal)
> 
> Hi Alistair
> I agree with you on this one for sure.
> 
> If I can't find an item (say, an audio file), that does not necessarily
> mean that the website should pass on that criterion. It may be added
> later, or I may just not have located it when looking for suitable pages
> to assess. I'm more comfortable with 'not applicable', 'not tested',
> 'not located on reviewed pages' or something similar.
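
(For illustration only: a minimal Python sketch of recording such a third value alongside pass/fail. The names are assumptions, not anything the task force has agreed on.)

    from enum import Enum

    class CheckResult(Enum):
        PASS = "pass"
        FAIL = "fail"
        NOT_LOCATED = "not located on reviewed pages"

    # e.g. no audio or video content was found in the sampled pages
    captions_1_2_2 = CheckResult.NOT_LOCATED
    print(captions_1_2_2.value)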
> 
> 
> Regards
> 
> Vivienne L. Conway, B.IT(Hons), MACS CT
> PhD Candidate & Sessional Lecturer, Edith Cowan University, Perth, W.A.
> Director, Web Key IT Pty Ltd.
> v.conway@ecu.edu.au
> v.conway@webkeyit.com
> Mob: 0415 383 673
> 
> This email is confidential and intended only for the use of the
> individual or entity named above. If you are not the intended
> recipient, you are notified that any dissemination, distribution or
> copying of this email is strictly prohibited. If you have received this
> email in error, please notify me immediately by return email or
> telephone and destroy the original message.
> 
> ________________________________
> From: Alistair Garrison [alistair.j.garrison@gmail.com]
> Sent: Thursday, 23 February 2012 5:43 PM
> To: Eval TF
> Subject: Re: Not Applicable (was Re: Evaluation scheme with three
> options - proposal)
> 
> Hi All,
> 
> I would feel a little uncomfortable declaring something to be passed
> simply because I could not find any applicable content.
> 
> I know that when showing conformity with ISO documentation you give a
> "reference to the appropriate Standard (title, number, and year of issue);
> when the certification applies to only a portion of a standard, the
> applicable portion(s) should be clearly identified" (ISO Guide 23).
> 
> In our situation, this would mean that we simply list all the things (most
> probably Success Criteria) we have found to be applicable in the
> Conformance Claim, and then go on to state which of these have been passed
> or failed in the report.
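
(As a rough illustration of that idea, here is a minimal Python sketch; the data and names are invented, not taken from ISO Guide 23 or the Methodology. The claim lists only the Success Criteria found applicable, and the report then gives a pass/fail for each.)

    findings = {
        # Success Criterion -> "pass", "fail", or None if no applicable content was found
        "1.1.1 Non-text Content": "pass",
        "1.2.2 Captions (Prerecorded)": None,   # no video located on the reviewed pages
        "2.4.4 Link Purpose (In Context)": "fail",
    }

    # Conformance claim: only the Success Criteria found to be applicable.
    applicable = {sc: result for sc, result in findings.items() if result is not None}
    print("Applicable Success Criteria:", ", ".join(applicable))

    # Report: state which of these passed or failed.
    for sc, result in applicable.items():
        print(f"{sc}: {result}")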
> 
> Hope this helps.
> 
> All the best
> 
> Alistair
> 
> On 22 Feb 2012, at 12:37, Emmanuelle Gutiérrez y Restrepo wrote:
> 
> My +1 too :-)
> 
> I think that this is very important.
> 
> Regards,
> Emmanuelle
> 
> 2012/2/22 Velleman, Eric
> <evelleman@bartimeus.nl>
> Hi Vivienne,
> 
> I think I put it on the agenda, so let's talk about it.
> Kindest regards,
> 
> Eric
> 
> 
> 
> ________________________________________
> From: Vivienne CONWAY [v.conway@ecu.edu.au]
> Sent: Wednesday, 22 February 2012 3:13
> To: Michael S Elledge; Shadi Abou-Zahra
> CC: public-wai-evaltf@w3.org
> Subject: RE: Not Applicable (was Re: Evaluation scheme with three
> options - proposal)
> 
> Hi all
> 
> I just ran across this discussion, which is something that I think we
> should put in the EVTF methodology. I know that I've been using n/a when
> it seemed the item was not present in the website, e.g. videos. However,
> if this is the W3C consensus, I'll need to change my reporting. Can we
> talk about this in our teleconference this week?
> 
> 
> Regards
> 
> Vivienne L. Conway, B.IT(Hons), MACS CT
> PhD Candidate & Sessional Lecturer, Edith Cowan University, Perth, W.A.
> Director, Web Key IT Pty Ltd.
> v.conway@ecu.edu.au
> v.conway@webkeyit.com
> Mob: 0415 383 673
> 
> ________________________________________
> From: Michael S Elledge [elledge@msu.edu]
> Sent: Wednesday, 22 February 2012 2:29 AM
> To: Shadi Abou-Zahra
> Cc: public-wai-evaltf@w3.org
> Subject: Re: Not Applicable (was Re: Evaluation scheme with three
> options  - proposal)
> 
> Thanks for the explanation, Shadi. I imagine it took some discussion to
> reach that consensus!  :^)
> 
> Mike
> 
> On 2/20/2012 2:30 PM, Shadi Abou-Zahra wrote:
> > Hi Mike,
> >
> > Good question. We had a long discussion about that and also asked the
> > WCAG Working Group about their position on this.
> >
> > According to WCAG WG, the term "Not Applicable" is not defined and is
> > ambiguous. Accessibility requirements are deemed met when the content
> > does not require specific accessibility features. For example, the
> > requirement for captioning is deemed met if there is no video content.
> >
> > I will try to dig out the corresponding pointers, but I recall that this
> > was something that was less clearly documented in the WCAG documents.
> > We will probably need to clarify this point somewhere in section 5 of
> > the Methodology, and possibly ask the WCAG WG to also clarify some of
> > their materials (so that we can refer to it from our explanation).
> >
> > Best,
> >   Shadi
> >
> >
> > On 20.2.2012 19:26, Michael S Elledge wrote:
> >> Hi Shadi--
> >>
> >> I noticed in the BAD example that success criteria for which there was
> >> no related web content received a "Pass." I'm curious why that approach
> >> was chosen rather than to identify such instances as "Not Applicable" or
> >> "NA." Wouldn't using the term "NA" be both more informative and
> >> accurate?
> >>
> >> Mike
> >>
> >> On 2/20/2012 10:43 AM, Shadi Abou-Zahra wrote:
> >>> Small addition:
> >>>
> >>> On 20.2.2012 16:28, Shadi Abou-Zahra wrote:
> >>>> Hi Kerstin, All,
> >>>>
> >>>> I'm not too sure what the difference between options #1 and #2 would
> >>>> be in practice, as I hope that evaluators will simply link to
> >>>> Techniques rather than to attempt to explain the issues themselves.
> >>>>
> >>>>
> >>>> Here is an example of what a report of option #1 could look like:
> >>>> - <http://www.w3.org/WAI/demos/bad/before/reports/home.html>
> >>>
> >>> Here is a positive example too: ;)
> >>> - <http://www.w3.org/WAI/demos/bad/after/reports/home.html>
> >>>
> >>>
> >>> Regards,
> >>> Shadi
> >>>
> >>>
> >>>> Note: this is a report for a single page but it could still be a
> >>>> basis for reports of option #1 for entire websites; it just has a
> >>>> pass/fail for each Success Criterion and some Techniques to justify
> >>>> these claims.
> >>>>
> >>>>
> >>>> For option #2 we could introduce a scoring function in addition to the
> >>>> pass/fail result. This would require the evaluators to fully evaluate
> >>>> every page in the selected sample and count the frequencies of errors
> >>>> to calculate a score. It could help compare websites and motivate the
> >>>> developers (at least those who are close to full compliance).
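
(A minimal sketch of such a scoring function in Python; the page sample, the counts and the formula are invented for illustration and are not part of the Methodology.)

    # failure instances per Success Criterion, for each page in the selected sample
    pages = {
        "home.html":    {"1.1.1": 3, "2.4.4": 1},
        "contact.html": {"1.1.1": 0, "2.4.4": 2},
    }

    checks = [count for per_page in pages.values() for count in per_page.values()]
    total_errors = sum(checks)

    # one possible score: the share of page/SC checks with no failures, on a 0-100 scale
    score = 100 * sum(1 for c in checks if c == 0) / len(checks)
    print(f"{total_errors} failure instances across the sample, score {score:.0f}/100")

Whether such a score should weight by Success Criterion, by page or by instance would of course be a decision for the Methodology itself.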
> >>>>
> >>>>
> >>>> Finally, option #3 would be more in-depth reports with examples of
> >>>> the errors and explanations of ways to repair the errors. These are,
> >>>> as Kerstin says, developed by consultants (as opposed to pure
> >>>> evaluators) for developers who are new to accessibility.
> >>>>
> >>>> We attempted to provide such an example report in the initial version
> >>>> of the Before and After Demo (BAD) but it is really a lot of work:
> >>>> - <http://www.w3.org/WAI/EO/2005/Demo/report/>
> >>>>
> >>>>
> >>>> Regards,
> >>>> Shadi
> >>>>
> >>>>
> >>>> On 19.2.2012 20:36, Elle wrote:
> >>>>> Kerstin:
> >>>>>
> >>>>> I like these three options. I am interested, however, in how many
> >>>>> clients typically ask for something as abbreviated as Option 1. For
> >>>>> those in this group, do you experience situations with a lot of
> >>>>> clients who don't want more than the pass/fail report?
> >>>>>
> >>>>>
> >>>>>
> >>>>> Regards,
> >>>>> Elle
> >>>>>
> >>>>>
> >>>>>
> >>>>>
> >>>>> On Sun, Feb 19, 2012 at 4:36 AM, Kerstin Probiesch
> >>>>> <k.probiesch@googlemail.com> wrote:
> >>>>>
> >>>>>> Hi all,
> >>>>>>
> >>>>>> in our last teleconference we discussed an evaluation scheme with
> >>>>>> three options based upon 100% Conformance. I appreciate these
> >>>>>> proposals and see them as a chance to integrate or point to the
> >>>>>> three documents of WCAG2: Guidelines and SCs, Understanding and
> >>>>>> How to Meet.
> >>>>>>
> >>>>>> One proposal for handling the documents in an evaluation scheme,
> >>>>>> based upon the normative guidelines and SCs as core:
> >>>>>>
> >>>>>> =====
> >>>>>> Option 1: WCAG 2.0 – Core Test ("light version" or whatever the
> >>>>>> wording will be later)
> >>>>>>
> >>>>>> # Guideline X (Heading)
> >>>>>>
> >>>>>> ## Checkpoint: SC XX (Subheading)
> >>>>>>
> >>>>>> Result: pass/fail
> >>>>>>
> >>>>>> Character: global/regional (or another wording) – if regional: a
> >>>>>> list of pages where the problem exists
> >>>>>>
> >>>>>> ## Checkpoint: SC XX (Subheading)
> >>>>>>
> >>>>>> Result: pass/fail
> >>>>>>
> >>>>>> Character: global/regional (or another wording) – if regional: a
> >>>>>> list of pages where the problem exists
> >>>>>>
> >>>>>> (...)
> >>>>>>
> >>>>>> =====
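
(One way such a per-checkpoint entry could be captured, as a minimal Python sketch; the field names are assumptions, not part of the proposal.)

    from dataclasses import dataclass, field
    from typing import List, Optional

    @dataclass
    class CheckpointResult:
        success_criterion: str                 # e.g. "1.1.1 Non-text Content"
        result: str                            # "pass" or "fail"
        character: Optional[str] = None        # "global" or "regional"
        pages: List[str] = field(default_factory=list)  # listed only if regional

    report = [
        CheckpointResult("1.1.1 Non-text Content", "fail", "regional",
                         ["/news.html", "/contact.html"]),
        CheckpointResult("2.4.2 Page Titled", "pass", "global"),
    ]

An Option 1 report would then essentially be a list of such records, one per checkpoint, preceded by the guideline headings.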
> >>>>>>
> >>>>>> Use cases for Option 1:
> >>>>>>
> >>>>>> - experienced developers and clients who know WCAG2 and need just
> >>>>>> the results,
> >>>>>> - comparative evaluations (20 hotel websites, city websites…)
> >>>>>> - or, for example, just the SCs of Level A and a smaller scope as a
> >>>>>> pre-test, to decide together with the client what the best next
> >>>>>> steps might be (evaluation, consulting, probably workshops for
> >>>>>> editors)
> >>>>>>
> >>>>>> =====
> >>>>>>
> >>>>>> Option 2: WCAG 2.0 – Core incl. understanding (name?)
> >>>>>>
> >>>>>> # Guideline X (Heading)
> >>>>>>
> >>>>>> ## Checkpoint: SC XX (Subheading)
> >>>>>>
> >>>>>> Result: pass/fail
> >>>>>>
> >>>>>> Character: global/regional (or another wording) – if regional: a
> >>>>>> list of pages where the problem exists
> >>>>>>
> >>>>>> Problem (Subheading): Description of existing problems and barriers
> >>>>>> for users (here, know-how from the Understanding document could be
> >>>>>> part of the description).
> >>>>>>
> >>>>>> ## Checkpoint: SC XX (Subheading)
> >>>>>>
> >>>>>> Result: pass/fail
> >>>>>>
> >>>>>> Character: global/regional (or another wording) – if regional: a
> >>>>>> list of pages where the problem exists
> >>>>>>
> >>>>>> Problem (Subheading): Description of existing problems and barriers
> >>>>>> for users (here, know-how from the Understanding document could be
> >>>>>> part of the description).
> >>>>>>
> >>>>>> (...)
> >>>>>>
> >>>>>> ======
> >>>>>>
> >>>>>> Use cases:
> >>>>>>
> >>>>>> - comparative evaluations (depending on the specific time and
> >>>>>> costs)
> >>>>>>
> >>>>>> - if a client just wants descriptions
> >>>>>>
> >>>>>> - regular tests like "evaluation of the week"
> >>>>>>
> >>>>>> =====
> >>>>>>
> >>>>>> Option 3: WCAG 2.0 – Core, understanding, how to meet (name?)
> >>>>>>
> >>>>>> # Guideline X (Heading)
> >>>>>>
> >>>>>> ## Checkpoint: SC XX (Subheading)
> >>>>>>
> >>>>>> Result: pass/fail
> >>>>>>
> >>>>>> Character: global/regional (or another wording) – if regional: a
> >>>>>> list of pages where the problem exists
> >>>>>>
> >>>>>> Problem (Subheading): Description/explanation of existing problems
> >>>>>> and barriers for users (here, know-how from the Understanding
> >>>>>> document could be part of the description).
> >>>>>>
> >>>>>> Action (Subheading): Description of techniques for meeting the SC
> >>>>>> (could be techniques which are already in the techniques document or
> >>>>>> new techniques which are not in the document, but with which the SC
> >>>>>> can be met). Here even usability aspects can play a role, like: you
> >>>>>> can do a, b, c or d – I/we propose/recommend c.
> >>>>>>
> >>>>>> ## Checkpoint: SC XX (Subheading)
> >>>>>>
> >>>>>> Result: pass/fail
> >>>>>>
> >>>>>> Character: global/regional (or another wording) – if regional: a
> >>>>>> list of pages where the problem exists
> >>>>>>
> >>>>>> Problem (Subheading): Description/explanation of existing problems
> >>>>>> and barriers for users (here, know-how from the Understanding
> >>>>>> document could be part of the description).
> >>>>>>
> >>>>>> Action (Subheading): Description of techniques for meeting the SC
> >>>>>> (could be techniques which are already in the techniques document or
> >>>>>> new techniques which are not in the document, but with which the SC
> >>>>>> can be met). Here even usability aspects can play a role, like: you
> >>>>>> can do a, b, c or d – I/we propose/recommend c.
> >>>>>>
> >>>>>> (...)
> >>>>>>
> >>>>>> ======
> >>>>>>
> >>>>>> Use cases:
> >>>>>>
> >>>>>> - test incl. consulting
> >>>>>>
> >>>>>> - for clients who are not very familiar with accessibility and
> >>>>>> WCAG2
> >>>>>>
> >>>>>> ============
> >>>>>>
> >>>>>> For a seal/badge or any formal confirmation, Option 1 is the
> >>>>>> minimum.
> >>>>>>
> >>>>>> A report might (or should?) also have intro parts like:
> >>>>>>
> >>>>>> - Short description of the chosen Option (1, 2 or 3)
> >>>>>>
> >>>>>> - Something like a disclaimer ("results might not be complete,
> >>>>>> therefore it is important to go through the page, view all similar
> >>>>>> elements and solve the corresponding problems")
> >>>>>>
> >>>>>> - Glossary (for specific terms we used in our methodology – like
> >>>>>> regional/global – if we decide to use them)
> >>>>>>
> >>>>>> - Documentation of the OS, browsers and versions used, and probably
> >>>>>> the assistive technologies used, incl. versions
> >>>>>>
> >>>>>> - Tested Conformance Level (A, AA, AAA)
> >>>>>>
> >>>>>> - Results
> >>>>>>
> >>>>>> - Summary, probably written as an overall impression. We discussed
> >>>>>> in this list the 'motivation factor'. I think the aim of an
> >>>>>> evaluation is not to motivate. Nevertheless, writing a nice overall
> >>>>>> impression in a report may have this function. OK, except when there
> >>>>>> is nothing nice to say.
> >>>>>>
> >>>>>> This scheme could probably also be used for processes, PDF, Flash
> >>>>>> and so on, and I think it would be flexible enough (time, costs,
> >>>>>> ...) and at the same time valid against the Conformance
> >>>>>> Requirements, because the core (the evaluation itself) is the same
> >>>>>> in every option.
> >>>>>>
> >>>>>> What is important, as I see it, is that the evaluator has the three
> >>>>>> different aspects in mind and in the report, and that they aren't
> >>>>>> mixed: evaluation (core, testing SCs), explanation (description of
> >>>>>> the problem/violation, understanding) and consulting (how to meet,
> >>>>>> usability, ...).
> >>>>>>
> >>>>>>
> >>>>>> The evaluator could document the "progress toward meeting success
> >>>>>> criteria from all levels beyond the achieved level of conformance":
> >>>>>> if, for example, the evaluation is for Level A with Option 3, the
> >>>>>> SCs of AA could also be checked (pass/fail), with or without further
> >>>>>> description, depending on the contract.
> >>>>>>
> >>>>>> Advantage: every evaluator or testing organization uses the
> >>>>>> methodology and a standardized 'template' for the core and the
> >>>>>> evaluation itself. The descriptions of existing barriers
> >>>>>> (explanatory part/understanding in Option 2 and 3) and the
> >>>>>> consulting part (How to meet, in Option 3) would be the specific
> >>>>>> added value for the clients/the evaluator/the testing organization.
> >>>>>>
> >>>>>>
> >>>>>> Thoughts?
> >>>>>>
> >>>>>> Best
> >>>>>>
> >>>>>> --Kerstin
> >>>>>>
> >>>>>>
> >>>>>> -------------------------------------
> >>>>>> Kerstin Probiesch - Freie Beraterin
> >>>>>> Barrierefreiheit, Social Media, Webkompetenz
> >>>>>> Kantstraße 10/19 | 35039 Marburg
> >>>>>> Tel.: 06421 167002
> >>>>>> E-Mail: mail@barrierefreie-informationskultur.de
> >>>>>> Web: http://www.barrierefreie-informationskultur.de
> >>>>>>
> >>>>>> XING: http://www.xing.com/profile/Kerstin_Probiesch
> >>>>>> Twitter: http://twitter.com/kprobiesch
> >>>>>> ------------------------------------
> >>>>>>
> >>>>>>
> >>>>>>
> >>>>>
> >>>>>
> >>>>
> >>>
> >>
> >
> 
> This e-mail is confidential. If you are not the intended recipient you
> must not disclose or use the information contained within. If you have
> received it in error please return it to the sender via reply e-mail
> and delete any record of it from your system. The information contained
> within is not the opinion of Edith Cowan University in general and the
> University accepts no liability for the accuracy of the information
> provided.
> 
> CRICOS IPC 00279B
> 
> 
> --
> Emmanuelle Gutiérrez y Restrepo
> Fundación y Seminario SIDAR
> URL: www.sidar.org
> email: emmanuelle@sidar.org
> 
> 

Received on Thursday, 23 February 2012 08:59:08 UTC