User feedback on WCAG-EM Report Tool

From: Kevin White <kevin@w3.org>
Date: Wed, 10 Sep 2014 15:52:44 +0100
Message-Id: <CB184DAB-1D19-4600-8DC9-3F2355FD00E5@w3.org>
To: EO Editors <wai-eo-editors@w3.org>
Hi All,

I asked a few friends to take a look at the WCAG-EM Report Tool and received two responses. Both are from seasoned accessibility and usability consultants, thoroughly cognisant of WCAG and how to run an audit. Their process more or less follows WCAG-EM, although they wouldn't recognise it by that name as they aren't aware of the methodology.

I have provided their raw commentary below:

Participant 1

> I found it confusing, to be honest. Being quite experienced in auditing websites for conformance to WCAG, I didn't know about the WCAG-EM methodology (yes, it might be me being lame :). I always approached these things intuitively and followed your guides, so I didn't know something like WCAG-EM existed. I understand this might impact the experience of using the tool, however...
> 1. My first impression was: "I don't really know what this is generating". I mean, I looked at the end and of course found what I expected, but I'd love to see some sample output first. I've read this:
> "This tool helps you generate a report according to the Website Accessibility Conformance Evaluation Methodology (WCAG-EM). The tool does not perform any accessibility checks. It provides options that allow you to record information about the website and the evaluation, including the evaluation results."
> and thought: why do I need it, then? Is it just to follow the framework of thinking?
> 2. I found it difficult to understand what to put in the different text boxes. I could digest the language myself, but if this were to be used by one of the less senior members of staff, they'd be lost (even though they are quite good UX specialists in their own right). I found Steps 2 and 3 particularly difficult to understand. For instance, the area where the tool asks for "Variety of web page types": even with the contextual help it left me feeling lost.
> 3. What was particularly puzzling is that it wasn't clear to me what format I should use for entering data into the boxes. Does it want a list, or a paragraph? Or a comma-separated list of things? Aww.
> 4. This led me to the conclusion that I would like to see some kind of training video perhaps (?) or something similar on how to use this tool to its full potential.
> I am actually quite surprised that this turned out to be so difficult - which makes me think there's something wrong with me (and clearly I don't want to feel like that as a user :)
> My general impression was: I could use this, but why should I use it over what I am doing at the moment, which is more of a manual process, but one that I know well and that works? This is not to criticise; I'd be the first to pick this up if it sped me up, but I don't know if it does.
> It looks like, and creates the impression of, something very useful and powerful, but it also builds up the expectation of a tool that's certainly not quick to use and requires quite a lot of learning. That's absolutely fine, but perhaps I wasn't prepared for that? I dunno.

Participant 2

> I was quite unsure what I was going to get from the overview on the first page. 
> 1. Define the scope. 
> I wasn't really sure what to put in the fields. e.g. I wasn't sure what to put in the first field, so clicked on the icon.
> Define the scope of the website, so that for each web page it is unambiguous whether it is within the scope of evaluation or not. 
> That doesn't help me know what I'm supposed to put there.
> I'd like simpler language.  e.g. evaluation commissioner.  
> I ended up having to click on every possible information link; some of this info could be brought out of the link and onto the page to save the trouble. I'd like some more help to fill it in. e.g. when it comes to the Accessibility support baseline, it could have checkboxes with some examples I can choose from, and I could add others.
> 2. Explore target website. Again, using very strange language. And I found it difficult to understand what to put in the fields, even after reading the help.
> 3. Select sample
> Have I not done that on the previous page where I put the URLs in?
> I can't actually work out how to select any of the links within the drop downs that I added previously.
> 4. Audit sample
> Ah ha. Now I'm recognising stuff. 
> Once I'd worked out the relationship between the pages with the checkboxes and the individual pages I found it easier.
> I'd maybe order it so you do the individual pages first, then write about the entire sample. e.g. I'm unlikely to know if the entire sample has passed until I've looked at the individual pages. I realise that the entire sample bit is probably at the top for the report with individual pages below, but the order just seemed a bit backwards. 
> 5. Report findings
> Pretty straightforward. 
> 6. View report
> Pretty straightforward.
> So, overall, I found steps 1-3 very confusing. I didn't know what to put in the fields, and I found the language used very difficult to comprehend without thinking seriously hard.
> Step 4 was easier as it contained things I started to recognise, although the ordering of the UI was a bit confusing.
> Steps 5 and 6 were straightforward.
> The output for this is actually really good. It's just that it's difficult to get there. 



Kevin White
W3C Web Accessibility Initiative (WAI)
e-mail: kevin@w3.org
phone: +1 617 532 0912
phone: +44 131 202 6009
about: http://www.w3.org/People/Kevin/

Received on Wednesday, 10 September 2014 14:53:20 UTC
