RE: Additional Point/Question: problem centered / page centered evaluation

Michael
You bring up an excellent point - the post-assessment environment.  How do we assist clients to maintain the accessibility of their site, and how often should a site be re-tested?  I think a good starting point is developing accessibility techniques such as checklists and key points for those adding content to existing sites, and teaching them how to evaluate what they are adding.  IMHO this should be added to what we're doing, as on-going testing and maintenance is an important part of establishing a methodology for testing websites.  Thoughts?


Regards

Vivienne L. Conway
________________________________________
From: public-wai-evaltf-request@w3.org [public-wai-evaltf-request@w3.org] On Behalf Of Michael S Elledge [elledge@msu.edu]
Sent: Saturday, 10 September 2011 12:35 AM
To: public-wai-evaltf@w3.org
Subject: Re: Additional Point/Question: problem centered / page centered evaluation

Hi All--

Once again I concur with Denis' point. Clients generally can't afford
site-wide manual evaluation, and evaluation tools are limited in
what they verify. This means, of course, that clients ultimately have
the responsibility for making their sites accessible. And that, as
evaluators, we have a limited amount of leverage over the accessibility
of a website.

All is not lost, however. We have found that reviewing an accessibility
report and repairing a site can give clients greater awareness of
accessibility and a stronger commitment to remaining compliant in the future.

Which leads to another question/suggestion: shouldn't we also address
the post-assessment environment? I.e., the need to embed commitment,
technique and knowledge into organizational culture and processes?

Mike

On 9/9/2011 9:27 AM, Denis Boudreau wrote:
> Hi Vivienne, all,
>
> On 2011-09-09, at 5:00 AM, Vivienne CONWAY wrote:
>
>> I think you explained that nicely.  I am always puzzled about how someone can say a website meets WCAG 2.0 AA if they haven't tested every single page against every single SC. Testing representative pages will give us a good 'idea' of the accessibility of the website, but I don't think anyone could/should certify a website using this method.  What if we missed a page with a critical problem?  And again, how do you locate every problem/page?  Being able to locate issues is one of the best arguments for using automated tools to ASSIST in an evaluation - they tend to help us see patterns/trends.  Thoughts?
> As an organization that does accessibility certification against various international and North American standards (WCAG 2.0, CLF 2.0, SGQRI 008 and Section 508), we've had to struggle with that for quite a while, so I feel I should react to your question. ;p
>
> As you know, doing a complete assessment of a single web page takes time. For us, going through our checklist takes about 2 hours per page, AT testing included. We charge around $300 CDN to do that. Now imagine that the client has a rather small website, with "only" 500 pages or so. Doing as you suggest would mean a $150,000 project, probably twice as much as what it cost them to get the website built in the first place. Of course, going through so many pages means we can get the job done more quickly on most pages, since you only have to check each template once, but still, with so many tests to run on the content, you can hardly go below 1 hour per page... So, even if we cut the time in half, we'd still be talking about $150 per page, for about 500 pages, so a whopping $75,000. Totally insane.
>
> So while it's true that in order to be sure you don't miss any critical problems, you'd have to go through each and every page, it's just not realistic to think that can be done... nobody has that kind of money. And those that might have that kind of money tend to have much larger websites, with tens of thousands of pages... "totally insane" takes on a whole new meaning.
>
> Which is why I suggested auditing only a sample of critical pages, based on the various templates, and then asking the client to make the required improvements on all pages themselves. Yes, we might miss a few things, but it still gives incredibly useful and relevant results that allow a website to become much better accessibility-wise. There are tools out there (Deque, HiSoftware and IBM being a few of the companies providing them) that can monitor thousands of pages in a very short time; those prove exceptionally useful for getting a general idea, and we're hoping to add them to our own methodology soon. If you can afford those tools, then you could do a thorough manual audit on sample pages while complementing it with an automated sweep of the whole site. That reduces your error margin considerably and makes your certification process more reliable.
>
> As an organization, we're not quite there yet, but we're working on it. When we label a website as WCAG 2.0 compliant, for example, we note that the certification is based on a sample of pages, audited at a specific point in time. We can even provide the source code that was audited because, of course, websites tend to degrade over time; unless the organization does a great job of maintaining it, the website rarely remains compliant. And we deliver the label for a one-year period, after which a new audit is required to renew it.
>
> I agree, this is not the ideal solution, but I think it's the best that can be done at this moment. This is one of the reasons we're part of this group. We are very aware of the limitations of this method and look forward to improving it quite considerably over the next two years with you all.
>
> Questions and comments are welcomed.
>
>

This e-mail is confidential. If you are not the intended recipient you must not disclose or use the information contained within. If you have received it in error please return it to the sender via reply e-mail and delete any record of it from your system. The information contained within is not the opinion of Edith Cowan University in general and the University accepts no liability for the accuracy of the information provided.

CRICOS IPC 00279B

Received on Saturday, 10 September 2011 00:19:36 UTC