RE: Readability tests

Sorry, I was away last week and did not check email very often.

In this last thread I was recommending tests that do not guarantee
conformance or accessibility, but are useful as a yardstick and as an
alarm bell. To some extent, all of our tests are like that.

Example of a known test: Flesch Reading Ease
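
For orientation, here is a minimal sketch (Python, used only for
illustration) of how such a score could be computed. The syllable
counter is a very rough heuristic (a real tool would use a
pronunciation dictionary), and the formula only applies to English:

import re

def count_syllables(word):
    # Rough heuristic: count groups of consecutive vowels.
    # A real tool would use a pronunciation dictionary.
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def flesch_reading_ease(text):
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z]+", text)
    if not sentences or not words:
        return None
    syllables = sum(count_syllables(w) for w in words)
    # Standard Flesch Reading Ease formula (higher = easier to read).
    return (206.835
            - 1.015 * (len(words) / len(sentences))
            - 84.6 * (syllables / len(words)))

print(flesch_reading_ease("The cat sat on the mat. It was happy."))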

There is a lot to say about the different approaches to readability
tests suggested on this list over the years. I have pulled out a few
approaches from past emails to this list as a selection:



Approach 1: "flexible"

There is a publicly available policy statement of the acceptable maximum
length of noun phrases, sentences and paragraphs, and the site conforms
to that policy.
There is a publicly available policy statement of the acceptable number
of words per sentence, and the site conforms to that policy.
There is a publicly available policy statement of the acceptable maximum
number of sentences per paragraph, and the site conforms to that policy.

Then a note on how to make a policy statement (e.g. a policy statement
should balance and justify the following factors: what length etc. is
truly necessary on the site, versus the people who will find it harder
to understand the site). See the sketch at the end of this approach for
how the length checks could be automated.

The key term or idea of each paragraph is easily identifiable
(techniques: through markup like em, or by "front loading"). This is
human testable: is a key term highlighted? Is it near the front?
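
A minimal sketch of an automated check against such a policy (the policy
numbers below are placeholders only; a real site would publish and
justify its own, and the noun-phrase and key-term checks would still
need human judgment):

import re

# Placeholder policy values; a real site would publish and justify its own.
POLICY = {"max_words_per_sentence": 20, "max_sentences_per_paragraph": 5}

def check_against_policy(text, policy=POLICY):
    violations = []
    paragraphs = [p for p in text.split("\n\n") if p.strip()]
    for i, para in enumerate(paragraphs, start=1):
        sentences = [s for s in re.split(r"[.!?]+", para) if s.strip()]
        if len(sentences) > policy["max_sentences_per_paragraph"]:
            violations.append("paragraph %d: %d sentences" % (i, len(sentences)))
        for s in sentences:
            n = len(s.split())
            if n > policy["max_words_per_sentence"]:
                violations.append("paragraph %d: sentence of %d words" % (i, n))
    return violations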


Approach 2:

In general, confirm that any discrepancies from the checks below were
necessary (see the sketch after this list for the mechanical checks):

Check for more than one conjunction per sentence
Check sentence length (over 20 words)
Check paragraph length
Check wording against a simple language lexicon, or provide a glossary
with references
Check that the tense and sentence style aid comprehension
Check for semantic and syntactic ambiguity
Check the number of paragraphs under a heading
Check implied meanings
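
A minimal sketch of the mechanical checks in this list (conjunction
count, sentence length, lexicon check); the conjunction list and
threshold are only illustrative, and the ambiguity and implied-meaning
checks remain human judgments:

import re

# Illustrative conjunction list; a real check would use a fuller one.
CONJUNCTIONS = {"and", "but", "or", "because", "although", "while", "so"}

def flag_sentences(text, max_words=20, simple_lexicon=None):
    flags = []
    for s in (s.strip() for s in re.split(r"[.!?]+", text) if s.strip()):
        words = [w.lower() for w in re.findall(r"[A-Za-z']+", s)]
        if len(words) > max_words:
            flags.append(("sentence over %d words" % max_words, s))
        if sum(1 for w in words if w in CONJUNCTIONS) > 1:
            flags.append(("more than one conjunction", s))
        if simple_lexicon is not None:
            unknown = sorted(set(words) - set(simple_lexicon))
            if unknown:
                flags.append(("words outside lexicon: " + ", ".join(unknown), s))
    return flags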

Approach 3: Consider what cognitive skills are truly necessary to
understand the ideas in a page.

For example, to understand the subject, would it be essential that the
user:

Has visual memory
Understands jokes and innuendo
Understands sarcasm
Has average auditory/word retention skills
Has average language skills (use of words)
Has average reading skills
(initial list)

Having made a profile of the minimum skills needed to understand the
ideas, check whether the language and writing style used require any
additional skills, and test the site against that profile (see the
sketch below).

If content is lost - fail.
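
The final comparison could be as simple as a set difference between the
two profiles (the skill labels here are just examples taken from the
list above):

# Example skill profiles; the real labels come from human review of the page.
skills_needed_for_ideas = {"average reading skills", "average language skills"}
skills_demanded_by_writing = {"average reading skills", "average language skills",
                              "understands sarcasm"}

extra = skills_demanded_by_writing - skills_needed_for_ideas
if extra:
    print("Fail: the writing demands more than the ideas require:", sorted(extra))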

Links, bibliographies, other emails, etc.:

Review of "Telecommunications Problems and Design Strategies for People
with Cognitive Disabilities: Annotated Bibliography and Research
Recommendations" by Ellen Francik, Ph.D.
http://www.wid.org/archives/telecom/Telecom.pdf
See http://lists.w3.org/Archives/Public/w3c-wai-gl/2001OctDec/0456.html

Chapter 2 should probably be essential or recommended reading:
http://www.wid.org/archives/telecom/chapter2.html




Review of guidelines for autism from David Potter of the UK National
Autistic Society:
http://lists.w3.org/Archives/Public/w3c-wai-gl/2001OctDec/0572.html

http://www.acf.dhhs.gov/programs/add/Factsheet.htm - January 25, 2002
http://www.mang.canterbury.ac.nz/courseinfo/AcademicWriting/Flesch.htm
AHRC New York at http://www.ahrcnyc.org/
DSM-IV (American Psychiatric Association, 1994)
AHRC New York City FAQ Sheet (2002): http://www.ahrcnyc.org/index.htm



> -----Original Message-----
> From: w3c-wai-gl-request@w3.org 
> [mailto:w3c-wai-gl-request@w3.org] On Behalf Of Maurizio Boscarol
> Sent: Thursday, October 14, 2004 2:20 PM
> To: Gez Lemon
> Cc: w3c-wai-gl@w3.org
> Subject: Re: Readability tests
> 
> 
> 
> Gez Lemon wrote:
> 
> >>What tests for clear writing do you know of, Lisa? Please send
> >>thoughts.
> >
> >I've implemented a service on Juicy Studio that determines the
> >Gunning Fog Index, Flesch Reading Ease, and Flesch-Kincaid Grade of a
> >web document <http://juicystudio.com/fog/>. I'm not sure of its
> >usefulness (if any) for languages other than English, and I'm also
> >not convinced about the underlying principles behind the algorithms.
> >The algorithms favour short monosyllabic sentences, regardless of
> >whether the sentence makes sense. Obviously, it's possible to get a
> >good score with gobbledy-gook, but I've had quite a lot of positive
> >feedback about its usefulness. Could make a starting point?
> >
> 
> I think it is a good starting point to experiment. In fact, we need
> empirical data on the usefulness of these readability indexes on the
> web. Consider that in a 1999 book by Jared Spool (Web Site Usability:
> A Designer's Guide, Morgan Kaufmann Publishers), he used the Gunning
> Fog Index and the Flesch-Kincaid Grade on web pages and found that:
> - the less readable the site was, the more successful users were with
> the site
> - the less readable the site was, the more users found the site clear,
> complete, satisfying and useful.
> 
> Yes, just the opposite of what you would expect. :) And the opposite
> of the usual behavior offline.
> 
> The hypothesis is that this is due to the different activity users
> carry out on a web page. Usually they don't read; they skim the page
> looking for something useful. This behavior is also explained in some
> recent "semantic" usability models of the web, like Pirolli's
> "information scent" and Blackmon, Polson and Kitajima's "Cognitive
> walkthrough for the web".
> 
> Moreover, as you noted, in languages other than English all these
> linguistic tools need to be revalidated.
> 
> Having a tool like yours may help a lot in experimenting further. I'd
> very much like to have such an easy tool to play with for the Italian
> language... ;-)
> 
> Maurizio Boscarol
> http://www.usabile.it/
> 
> 

Received on Sunday, 17 October 2004 14:56:38 UTC