Re: page and site complexity measures [was Re: Web Content Accessibility Guidelines]

If I remember right (I don't have all my books with me here in Boston), there
was something on this by Tullis in Helander's book:

Helander, M. (ed.) (1988). Handbook of Human-Computer Interaction. Amsterdam:
Elsevier.

Also I think Edward Tufte has written something on this, but I would need to do
more research to find the exact references.

The measures were based on how much white space the page had and how it was
distributed to form groups. The interfaces were mostly alphanumeric. This is
naturally just the tip of the iceberg: there are many important things that
simplify the presentation of data but are hard to measure on a scale.
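Roughly, the simplest measures of that kind could be sketched like this (this
is only an illustration of the general idea, not Tullis's exact formulas):
treat an alphanumeric screen as a grid of character cells, compute the share
of non-blank cells, and count contiguous clusters of text as "groups".

```python
def overall_density(screen):
    """Fraction of character cells that are non-blank."""
    rows = screen.splitlines()
    width = max(len(r) for r in rows)
    total = width * len(rows)
    filled = sum(1 for r in rows for ch in r if not ch.isspace())
    return filled / total

def count_groups(screen):
    """Count contiguous clusters of non-blank characters (4-connected)."""
    rows = screen.splitlines()
    filled = {(y, x) for y, r in enumerate(rows)
              for x, ch in enumerate(r) if not ch.isspace()}
    groups = 0
    while filled:
        groups += 1
        stack = [filled.pop()]
        while stack:
            y, x = stack.pop()
            for cell in ((y + 1, x), (y - 1, x), (y, x + 1), (y, x - 1)):
                if cell in filled:
                    filled.remove(cell)
                    stack.append(cell)
    return groups

# A toy 3-line "form screen": two labels and two field values.
screen = ("NAME:  SMITH      \n"
          "                  \n"
          "CITY:  BOSTON     ")
print(round(overall_density(screen), 2))  # 21 of 54 cells filled -> 0.39
print(count_groups(screen))               # 4 separate clusters of text
```

A lower density and a small number of well-separated groups would count as a
simpler layout under this kind of measure.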

Marja

At 03:35 PM 3/3/99 -0500, Leonard R. Kasday wrote:
>Jonathan mentioned some facts about vocabulary and page complexity, e.g.
>people whose vocabulary is less than ~2,000 words.
>
>Are there lists of these words?  Perhaps the evaluation tool could do a
>count of words outside the list for its ratings.  Or, getting a bit more
>sophisticated, if there's a list of probabilities that words are outside a
>person's vocabulary, then the measure could be the statistically expected
>number of words outside the vocabulary.
>
>There are various automated reading level measures around.  Would any of
>those help?
>
>And I wonder if it would be possible to automate a measure of layout
>complexity?  I remember there were measures like that in the old literature
>associated with forms on dumb terminals.
>
>Going further with this, how about measures of site complexity?
>
>Anyone know of any literature here?
>
>Len
>
>-------
>Leonard R. Kasday, Ph.D.
>Universal Design Engineer, Institute on Disabilities/UAP, and
>Adjunct Professor, Electrical Engineering
>Temple University
>
>Ritter Hall Annex, Room 423, Philadelphia, PA 19122
>kasday@acm.org        
>(215} 204-2247 (voice)
>(800) 750-7428 (TTY)
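Len's vocabulary idea above could be sketched roughly like this (the word
list and probabilities here are made-up stand-ins, just for illustration):
count words that fall outside a basic vocabulary, and, given a per-word
probability of being unknown to a reader, sum those probabilities to get the
statistically expected number of unknown words.

```python
import re

# Stand-in basic vocabulary; a real tool would use a published word list.
BASIC_WORDS = {"the", "a", "page", "is", "easy", "to", "read"}

def words(text):
    return re.findall(r"[a-z']+", text.lower())

def out_of_vocabulary_count(text, vocabulary=BASIC_WORDS):
    """Count words not found in the basic vocabulary list."""
    return sum(1 for w in words(text) if w not in vocabulary)

def expected_out_of_vocabulary(text, p_unknown):
    """p_unknown maps a word to the probability a reader doesn't know it;
    unlisted words are assumed unknown (probability 1.0)."""
    return sum(p_unknown.get(w, 1.0) for w in words(text))

print(out_of_vocabulary_count("The page is easy to peruse"))  # only "peruse"
```

The first function is the simple rating Len describes; the second is the
"statistically expected" refinement, which reduces to the first when every
probability is 0 or 1.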

Received on Thursday, 4 March 1999 13:20:59 UTC