Re: International Web Access Guidelines “Ineffective”, PhD Thesis Claims

Greetings all,

And in the shameless plug department, the nonprofit group for which I 
work (Knowbility) maintains a large database of people with 
disabilities for remote user testing through Loop11, an Australian 
usability company.  While not quite as robust as in-person user 
testing, remote testing has been shown to be an effective (and 
relatively inexpensive) way to conduct inclusive usability tests. 
Here's the link: 
AccessWorks Testing Portal 
<http://www.knowbility.org/v/service-detail/AccessWorks-Usability-Accessibility-Testing-Portal/3k/>

Best,
Sharron
------------------------------------------------------------------------------------------------------------
Sharron Rush
Executive Director | Knowbility.org |
Equal access to digital technology for people with disabilities

At 03:55 AM 6/3/2013, Alastair Campbell wrote:
>Hi Steve,
>
>I think in general we are in vociferous agreement! ;-)
>
>Steve Green wrote:
> > In general it is more costly and logistically more difficult to
> > conduct user testing with PWD than it is for fully-able users, and
> > it is also more difficult to interpret the results because there
> > are more factors involved.
>
>Generally true, but having done so much of it now the increase in cost
>is relatively small in the scheme of things. Extra costs tend to be
>transport and perhaps a BSL interpreter, but it is not that much
>extra. We like to have regular user-research people facilitate the
>testing and create the list of issues, but then work with an
>accessibility expert (developer) to come up with recommendations. That
>keeps the results honest and the recommendations useful.
>
>
> > Whether it is a new build or an audit of an existing website (as
> > all those in the study were), it is most efficient to conduct user
> > testing with fully able people, fix any issues arising from that,
> > and then conduct user testing with PWD.
>
>Taking a step back, testing is not the only way. UCD projects should
>include many aspects of user-research before you ever get to testing.
>Surveys, card sorts, etc., depending on the needs of the project.
>
>Also, we sometimes have clients who have conducted lots of usability
>testing already and are looking for the next level of improvements,
>and PWD often uncover issues that do affect most people to some
>degree but are easy to miss in regular testing.
>
>I'm sure you agree that it depends on the state of the site and the
>needs of the project.
>
>
> > It is worth bearing in mind that user testing only assesses the
> > usability and accessibility of the selected scenarios and the paths
> > the participants choose to take through the website. For this
> > reason it is essential to conduct WCAG audits and expert reviews
> > that methodically go through the whole website (or as much as is
> > practical). This is why I believe the study's conclusion is incorrect.
>
>Sure, but if those scenarios are key ones (e.g. "buy something" on an
>ecommerce site) they provide critical data, and are helpful to
>prioritise the changes. Like an audit which takes a sample of pages,
>it is then up to the researcher / developer to generalise the findings
>and make best use of resources to improve the site.
>
>Cheers,
>
>-Alastair

Received on Monday, 3 June 2013 13:18:53 UTC