
Re: CHANGE PROPOSAL: Table Summary

From: Laura Carlson <laura.lee.carlson@gmail.com>
Date: Sun, 6 Dec 2009 17:04:35 -0600
Message-ID: <1c8dbcaa0912061504i77763a0coc50190a3ae632e58@mail.gmail.com>
To: Ian Hickson <ian@hixie.ch>
Cc: HTML Accessibility Task Force <public-html-a11y@w3.org>, Roger Johansson <roger@456bereastreet.com>
Hi Ian,

> I should first say that it is
> well-established that asking questions in usability studies leads to
> results that are dramatically biased towards a supportive answer

I agree with that. In formal usability tests, if a facilitator is
asked for help, he or she should say that the goal is for the
participant to try to solve the problems on their own. A facilitator
should stay neutral in words and body language. And they should be
careful not to ask leading questions that may skew the participant's
responses.

But surveys are a type of usability evaluation tool [1]. They have
strengths and weaknesses, as do other evaluation methods. WebAIM has
conducted a couple of surveys of screen reader users [2] [3]. I was
thinking about a survey along those lines with questions such as Roger
posed. It wouldn’t be as labor intensive as full-blown usability
testing and might provide us with some information... Another thought:
doing one wouldn’t preclude doing the other.

> Also, it's important to remember that nobody (as far as I know) is arguing
> that table explanations are undesirable

It is great that we now agree on that part of the table summary issue.

> I'm not sure that a survey would be the best way of collecting data about
> the best way to improve accessibility. I think a better way to get data
> about this would be a set of usability studies of Web authors followed by
> double-blind studies of the pages they write. For example, take six to
> nine Web developers, and give them the task of marking up some Web pages
> that include particularly complex data tables in an accessible way that is
> still aesthetically pleasing to them. The developers would be split into
> three groups, one being given instructions on using summary="", one being
> given instructions on writing paragraphs around the table, and one being
> given no instruction at all. Then, take the resulting pages, and bring in
> six to nine users of assistive technologies, and randomly give each one
> some of the pages created, and ask them to fill in a questionnaire based
> on the data in the table. Then, a researcher who is not aware of any of
> these events is asked to "grade" the questionnaires, and determine which
> show a better understanding of the underlying data.

Interesting proposal. It has quite a few phases and variables.

Best Regards,

[1] http://www.d.umn.edu/itss/support/Training/Online/webdesign/testing.html
[2] http://www.webaim.org/projects/screenreadersurvey/
[3] http://www.webaim.org/projects/screenreadersurvey2/

Laura L. Carlson
Received on Sunday, 6 December 2009 23:05:11 UTC
