- From: John M Slatin <john_slatin@austin.utexas.edu>
- Date: Fri, 24 Oct 2003 15:23:55 -0500
- To: <w3c-wai-gl@w3.org>
- Message-ID: <C46A1118E0262B47BD5C202DA2490D1A1DFB08@MAIL02.austin.utexas.edu>
Gregg said on this afternoon's joint call with ATAG that "Write clearly" isn't testable. I beg to differ. Clarity isn't *machine-testable*. However, it *is* possible to achieve a high degree of inter-rater reliability for written work. In the US, the Educational Testing Service (ETS) trains teams of people to score high-stakes examinations that can determine whether students are or are not admitted to university. These evaluators read and score student essays, and they achieve enough inter-rater reliability for their results to be accepted by many colleges and universities. Teachers using the Learning Record (a portfolio-based learning assessment tool) were achieving inter-rater reliability ratings of 89%, the last time I heard the statistics.

The key is in Gregg's phrase about raters "who know what they're talking about." I would argue that it's possible to train people to read Web content and make informed judgments about its clarity.

John

"Good design is accessible design."
Please note our new name and URL!

John Slatin, Ph.D.
Director, Accessibility Institute
University of Texas at Austin
FAC 248C
1 University Station G9600
Austin, TX 78712
ph 512-495-4288, f 512-495-4524
email jslatin@mail.utexas.edu
web http://www.utexas.edu/research/accessibility/
Received on Friday, 24 October 2003 16:31:05 UTC