RE: 3.1 - simple writing is testable...

Lisa, thanks for the suggestions about what kinds of statements might be
included in metadata.
 
Guideline 3.1 has success criteria at both Level 1 and Level 2 that
address some of your suggestions, and I think you've indicated a way
that some of the suggested "Strategies for reducing complexity" could be
moved up from Level 3 to Level 2.
 
Existing success criteria that address your points include the
following:
At Level 1
2. The meaning of abbreviations and acronyms can be programmatically
located.
 
At Level 2
2. The meanings and pronunciations of all words in the content can be
programmatically located.
3. The meaning of all idioms in the content can be programmatically
determined.
Metadata statements about word count, maximum sentence-length, average
sentence-length, percentage of sentences in passive voice, and reading
level could also be required, as per Lisa's suggestion.  Calling for
such metadata at Level 1 would simply be a requirement to provide
descriptive information about the content-- the longest sentence might
be 50 words, average sentence-length could be 33.7 words, etc.  At Level
2, there could be constraints on sentence-length, percentage of
sentences in passive voice, readability level, etc.
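For concreteness, here is a minimal sketch (in Python) of how such descriptive statistics might be computed. The passive-voice detector and the syllable counter are crude heuristics of my own, not established algorithms; only the Flesch-Kincaid grade formula is standard.

```python
import re

def text_statistics(text):
    """Descriptive statistics of the kind discussed above.
    The passive-voice check is a rough regex heuristic (not a parser),
    and the syllable count is approximate."""
    sentences = [s for s in re.split(r'[.!?]+\s*', text) if s.strip()]
    lengths = [len(s.split()) for s in sentences]
    words = sum(lengths)

    # Heuristic: a form of "to be" followed by a word ending in -ed/-en.
    passive = re.compile(r'\b(is|are|was|were|be|been|being)\s+\w+(ed|en)\b',
                         re.IGNORECASE)
    passive_count = sum(1 for s in sentences if passive.search(s))

    def syllables(word):
        # Very rough: count runs of vowels.
        return max(1, len(re.findall(r'[aeiouy]+', word.lower())))

    total_syllables = sum(syllables(w) for s in sentences for w in s.split())

    # Flesch-Kincaid grade level (the standard published formula).
    grade = 0.39 * words / len(sentences) + 11.8 * total_syllables / words - 15.59

    return {
        'word_count': words,
        'max_sentence_length': max(lengths),
        'avg_sentence_length': round(words / len(sentences), 1),
        'passive_pct': round(100.0 * passive_count / len(sentences), 1),
        'reading_grade': round(grade, 1),
    }
```

A dictionary like the one returned here is exactly the kind of descriptive information that a Level 1 metadata statement could carry.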
 
It should be noted that there are widely available tools that generate
these statistics.  For example, Microsoft Word has an option to generate
readability statistics after completing a spelling and grammar check
(Tools | Options | Grammar and spelling | Show readability statistics).

 
Jason sent me a message the other day about the UNIX "style" program,
which can generate an even more comprehensive profile of text documents.

 
As Wendy and Katie pointed out back in the fall when we first started
working toward a plain language version of the guidelines, Stylewriter
(http://www.editorsoftware.com/stylewriter-software/) can check
documents for plain language issues.  (Stylewriter itself isn't
accessible, though, so I haven't been able to try it out myself.)
 
It may not be necessary to run numbers on an entire site-- many of the
discussions I've seen talk about using samples-- for example, five
chunks of text of 100 words each.  The Plain Language Audit Tool from
the Northwest Territories Literacy Council explains how to do a
readability check manually
(http://www.nwt.literacy.ca/plainlng/auditool/8.htm). 
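A sampling approach along those lines could be automated with something as simple as the sketch below, which takes up to five evenly spaced 100-word chunks from a text. The even-spacing strategy is my own assumption; the audit tool above describes doing this by hand.

```python
def sample_chunks(text, n_chunks=5, chunk_size=100):
    """Take up to n_chunks evenly spaced samples of chunk_size words
    each, as a cheap alternative to profiling an entire site's text."""
    words = text.split()
    if len(words) <= chunk_size:
        return [words]
    step = max(1, (len(words) - chunk_size) // max(1, n_chunks - 1))
    starts = list(range(0, len(words) - chunk_size + 1, step))[:n_chunks]
    return [words[s:s + chunk_size] for s in starts]
```

Each chunk could then be fed to whatever readability statistics one chooses, and the results averaged.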
 
I'm not aware of any Web-authoring tools that include such
functionality.

John
"Good design is accessible design." 
Please note our new name and URL!
John Slatin, Ph.D.
Director, Accessibility Institute
University of Texas at Austin
FAC 248C
1 University Station G9600
Austin, TX 78712
ph 512-495-4288, f 512-495-4524
email jslatin@mail.utexas.edu
web http://www.utexas.edu/research/accessibility/


 

-----Original Message-----
From: w3c-wai-gl-request@w3.org [mailto:w3c-wai-gl-request@w3.org] On
Behalf Of Lisa Seeman
Sent: Sunday, June 06, 2004 2:02 am
To: w3c-wai-gl@w3.org
Subject: RE: 3.1 - simple writing is testable...


I would further like to add that over the years there have been many
suggestions of simple-writing success criteria that no one has argued
are untestable or would restrict freedom of expression.
 
These include (note: this is _not_ all of them):
 
A metadata simple-language policy statement, where the author sets, in
a machine-readable form, boundaries for use on the page; conformance is
then testable against that metadata statement.
 
This would include the author/policy maker setting limits such as: 

*	A public/metadata statement as to maximum words per sentence
and sentences per paragraph 
*	A public/metadata statement as to tenses and number of
conjunctions in a paragraph
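Conformance to such a statement is mechanically checkable. A sketch, with both the measured statistics and the policy as plain dictionaries -- the key names and limits below are purely illustrative, not from any existing metadata vocabulary:

```python
def conforms(stats, policy):
    """True if every limit in the policy statement is met.
    On a real page the policy would live in machine-readable metadata;
    the plain-dict form here is just for illustration."""
    return all(stats.get(key, float('inf')) <= limit
               for key, limit in policy.items())

# An illustrative policy statement for, say, a government site:
policy = {'max_words_per_sentence': 25, 'sentences_per_paragraph': 6}
```

A checker like this is what makes the metadata statement itself the testable artifact, rather than any one-size-fits-all threshold.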

It would also help people find sites that are useful to them, and help
search engines with rankings, etc. More to the point, it will make
policy makers actively decide who they are leaving out and why. We can
have example recommended metadata statements for a typical e-commerce
site, a government site, etc., but those would be non-normative.
 
Another testable idea was using words from a simple-language lexicon and
putting any extra words in a glossary (so you can, in fact, use any
word, so long as it is in the glossary).
 
all the best
 
Lisa

	-----Original Message-----
	From: w3c-wai-gl-request@w3.org
[mailto:w3c-wai-gl-request@w3.org] On Behalf Of Lisa Seeman
	Sent: Wednesday, June 02, 2004 12:07 PM
	To: w3c-wai-gl@w3.org
	Subject: 3.1 - simple writing...
	
	
	Simple writing has been moved down to level 3
	 
	 
	<Quote> "There is a statement associated with the content
asserting that the Strategies for Reducing the Complexity of Content
(the following list) were considered."</Quote>
	 
	I cannot see why this is not Level 1. 
	 
	But you all probably know that.
	 
	Why can't every site at least consider writing clearly? And the
existence of a statement (possibly in metadata) seems to me to be
testable.
	 
	I think someone needs to explain it to me over a beer or
something. I just don't get it.
	 
	All the best

	Lisa Seeman

	 

	Visit us at the UB Access <http://www.ubaccess.com/>  website

	UB Access - Moving internet accessibility

	 

	 

Received on Monday, 7 June 2004 10:59:43 UTC