Heuristic testing

Hi everyone,

Jeanne mentioned a doc on the AG call today: Heuristics for Silver Conformance
https://docs.google.com/document/d/1lEkht-bhkaPMzOojWpDjZhbnGeckiYLh-J4-ze7cGS0/edit#

The document isn't open for comments, so a few comments here. For context: my background is originally in Human-Computer Interaction; I started off my professional life doing many heuristic "usability" evaluations for an agency.

The intent for heuristic evaluations was originally to come up with issues & recommendations based on the experience of the tester(s). Getting from that to a score is very arbitrary, and very prone to individual (tester) differences.

The history section notes the 80% consistency rate for WCAG, but I'd put money on that number being far lower for a heuristic approach due to the higher-level nature of the heuristics.

(Does a quick search), "when professional evaluators conducted heuristic evaluations, the most likely outcome was that about half of the problems identified would be true problems and half would be false positives." [1]
My emphasis on 'professional', as that's the best case!

In short, I don't see how the use of heuristic evaluation could meet the requirement to "have tests or procedures so that the results can be verified", at least in any faintly feasible way. That isn't what the method was created for.

I could see it being part of the gold/platinum level of conformance, or perhaps as a process check ("Have you done usability testing/heuristic evaluation and acted on the results?")

Also, from the document: "Since a heuristic evaluation should be conducted by two or more expert evaluators" and later "Evaluators should have domain knowledge - not only in the product area, but in the heuristic area."

This would massively increase costs in general, and make it something that small organisations cannot do themselves.

I'm all for new ways of measuring things, e.g. getting away from instant fails for minor issues. However, the guidelines do need to spell out how results are measured. They can't rely on having a group of experts agreeing between themselves for each evaluation.

Sorry to trample in, I was just intrigued by Jeanne's comment and wanted to find out more, then it hit a nerve of past experiences!

Kind regards,

-Alastair

[1] https://www.researchgate.net/profile/Michael_Muller/publication/221515225_Usability_in_practice_Alternatives_to_formative_evaluations_-_Evolution_and_revolution/links/54dac1b70cf2ba88a68dd7de/Usability-in-practice-Alternatives-to-formative-evaluations-Evolution-and-revolution.pdf Page 886.

--

www.nomensa.com
tel: +44 (0)117 929 7333 / 07970 879 653
follow us: @we_are_nomensa or me: @alastc
Nomensa Ltd. King William House, 13 Queen Square, Bristol BS1 4NT

Company number: 4214477 | UK VAT registration: GB 771727411

Received on Tuesday, 23 April 2019 23:41:57 UTC