
Re: [Demo] Evaluation Report

From: Swan, Henny <Henny.Swan@rnib.org.uk>
Date: Wed, 15 Mar 2006 09:09:44 -0000
Message-ID: <7DCC97516CAEE343BD17A00F900754E1056036B0@jstmsx01.ads.rnib.org.uk>
To: <w3c-wai-eo@w3.org>, <shadi@w3.org>, <E.Velleman@bartimeus.nl>


Hi all,

Eric, interesting point about having the generic advice on how to fix
identified problems in a document separate from the Evaluation Report.
I agree that a separate document is useful, as much of the information
is often repeated. However, an evaluation can also be intended to
provide targeted advice on how to fix issues specific to the site that
were identified during an audit. For this reason we offer clients both
types of evaluation: a full evaluation ("See it Right audit") and a
"summary overview".

The "See it Right audit" takes an approach similar in part to the one
suggested by Eric. The deliverables are in three documents:

1. A background document outlining generic fixes for issues 
2. An audit report outlining specific fixes for issues identified in
the site
3. A summary document

Breaking the work into these three documents, dividing general
background from site-specific feedback, is necessary so that clients
can see how fixes are applied to their sites rather than to sites in
general. This is really the value of the full audit as opposed to a
summary overview. Often there is a range of suggestions that can be
offered to fix a given issue, and it is here that we can help the
client identify what is best practice and most appropriate for them -
something that cannot always be achieved in a generic document. The
third part, the summary document, then acts as a working document where
roles and responsibilities are logged, as well as the solutions the
client decides to adopt. This is done either with or without our input.
The key here is that the "See it Right" audit's objective is to
accredit a site.

The "summary overview" looks at each checkpoint and marks it as pass,
fail, or not applicable. No site-specific detail is given other than a
reference example and URL. The "summary overview" is also issued with
the background document. It is often used for benchmarking, and sites
cannot be accredited as a result of this piece of work.
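As a rough illustration of what such a checkpoint summary amounts to (the checkpoint names and results below are hypothetical, not taken from any actual audit), it is essentially a per-checkpoint table of pass/fail/not-applicable outcomes that can then be tallied for benchmarking:

```python
# Hypothetical sketch of a "summary overview": each checkpoint is
# marked pass, fail, or not applicable (n/a); no site-specific detail
# beyond a reference example is recorded.
from collections import Counter

# Illustrative results only -- not from a real evaluation.
checkpoint_results = {
    "1.1 Provide text equivalents": "fail",
    "2.1 Don't rely on colour alone": "pass",
    "5.1 Identify table headers": "fail",
    "7.4 Don't auto-refresh pages": "n/a",
}

def summarize(results):
    """Tally how many checkpoints fall into each outcome."""
    return Counter(results.values())

summary = summarize(checkpoint_results)
print(summary["pass"], summary["fail"], summary["n/a"])  # 1 2 1
```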

The reason we offer these two approaches really comes down to client
demand and the client's objectives in having the site evaluated. These
objectives can include a number of things: benchmarking, accreditation,
training/education, analysis to inform design documentation, strategy,
or quality assurance documentation (to name a few). Each of these can
call for a different approach to evaluation.

Regarding Shadi's comments below, I agree that there should be:

a) better clarification of the scope of this particular report and
b) alternative examples of an Evaluation Report 

More context and clarification need to be given about the scope and
purpose of this Evaluation Report example, and this should then be
clearly communicated on the "Using the demonstration" page. This could
perhaps take the form of a mock "project brief" stating City Light's
identified objectives, desired outcomes, and measurables.

Many thanks, Henny 

-----Original Message-----
From: w3c-wai-eo-request@w3.org [mailto:w3c-wai-eo-request@w3.org] On
Behalf Of Shadi Abou-Zahra
Sent: 13 March 2006 12:50
To: Eric Velleman
Cc: w3c-wai-eo@w3.org
Subject: Re: [Demo] Evaluation Report


Hi Eric,

Thank you for your input. Please find some comments below:

Firstly, we did not set a requirement to follow EN 17020 ("General
criteria for the operation of various types of bodies performing
inspection"), nor is this report (and the whole demo) intended to be a
best practice resource for certification. The goal of this deliverable
is to help raise awareness of Web accessibility issues, and to provide
developers with concrete examples. Please see the "Using the
Demonstration" section for more background:
  <http://www.w3.org/WAI/EO/2005/Demo/#howto>

Secondly, we may add other types of reports (and/or educational
resources) to expand the demo as we go along, but for now we chose the
more educational approach for the report. Many organizations provide
boilerplate information about the issues encountered during an
evaluation review within their audit reports, in order to help
developers better understand the issues and achieve a more sustainable
level of quality. Also, our evaluation template has a section called
"Results and Recommended Actions" (as opposed to "results" only):
  <http://www.w3.org/WAI/eval/template.html#results>

All in all, it seems to me that we need to better clarify the scope and
intent of this report (and the demo as a whole) - a "meta-introduction",
as someone suggested during the previous EOWG call...

Regards,
  Shadi


Eric Velleman wrote:
> Hello Shadi,
> 
> We very much like your document and the before/after website. We use 
> the evaluation report template for our reports all the time. We do 
> about 150 website evaluations per year and the reactions to the 
> format are very good. But in the reports, we cannot add the advice as 
> you did because of the EN 17020 standard. Evaluators cannot give 
> advice; that would more or less compromise the independence of their 
> (next) evaluation. Also, in 
> doing this you/we would be repeating ourselves all the time as we see 
> the same problems on many websites. As a solution, we point to a 
> separate brochure with all the explanations and many practical code 
> examples. If we encounter a new problem, we add a new part to the 
> brochure. My advice would be to take the advice out and point to 
> another resource for that, such as the curriculum.
> 
> Also, it would be good to add the resources that were in the 
> scope/sample at the end of the document, although it is quite obvious 
> what they are in this case. For larger websites, this is very 
> interesting and necessary for conformance with evaluation standards.
> 
> Kindest regards,
> 
> Eric Velleman
> Accessibility Foundation
> www.accessibility.nl
> 
> 
> 
> 
> 
>>>> Shadi Abou-Zahra <shadi@w3.org> 10-3-2006 13:48:05 >>>
> 
> Dear Group,
> 
> Apologies for the delay in sending this so shortly before the 
> meeting; I've been trying to put in everything we need for discussion 
> on today's call.
> 
> Please find the evaluation report for the Before/After Demo (BAD):
>   <http://www.w3.org/WAI/EO/2005/Demo/report>
> 
> Note: each of the sections in this report has enough content for 
> intense discussions so I propose that we use today's time to focus on 
> the overall structure and organization of the report. Specifically, 
> the following aspects:
> 
> * If you were expecting an evaluation report for your Web site, is 
> this the type of information you would want in it? (Consider being a 
> manager or a developer)
> 
> * If you were writing an evaluation report for a Web site, does this 
> report help you as a model to follow?
> 
> * If you wanted to learn more about Web accessibility, does the 
> evaluation sufficiently outline some of the issues?
> 
> 
> Looking forward to your input.
> 
> Regards,
>   Shadi
> 
> 

-- 
Shadi Abou-Zahra     Web Accessibility Specialist for Europe | 
Chair & Staff Contact for the Evaluation and Repair Tools WG | 
World Wide Web Consortium (W3C)           http://www.w3.org/ | 
Web Accessibility Initiative (WAI),   http://www.w3.org/WAI/ | 
WAI-TIES Project,                http://www.w3.org/WAI/TIES/ | 
Evaluation and Repair Tools WG,    http://www.w3.org/WAI/ER/ | 
2004, Route des Lucioles - 06560,  Sophia-Antipolis - France | 
Voice: +33(0)4 92 38 50 64          Fax: +33(0)4 92 38 78 22 | 


Received on Wednesday, 15 March 2006 09:10:43 UTC
