RE: ATAG2.0 A.3.6.1 draft test - more work needed - per my action item

Hi Tim,

Nice... but to show some mercy to evaluators, I think we can tell them that the tool can FAIL as soon as they find even one instance where the SC isn't met. And I've tried to simplify a bit:

1. Determine (from the user interface or documentation) whether the authoring tool includes settings that affect how the content being edited is perceived by the author. If these settings exist, then document them. If not, then select SKIP.

2. Create/open web content with the authoring tool.

3. Determine a "method for testing how the web content will be experienced by end-users" (this may be as simple as opening the content in a user agent; or it may involve ending the authoring session).

4. For each setting from Step 1, change the setting to a different value. Try to choose values that are as different as possible from the starting values, since this will make detecting differences easier. After changing each setting, save the content and then use the "method for testing how the web content will be experienced by end-users" (from Step 3). If the end user experience has changed for any of the settings, then select FAIL.

5. If all settings can be changed without affecting the produced web content, then select PASS. (A rough sketch of this evaluation logic follows below.)
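
For what it's worth, here's a minimal Python sketch of that logic, assuming the end-user output for each setting change has already been captured (e.g. by opening the saved content in a user agent, per Step 3). The setting names and captured strings are hypothetical placeholders, not part of the procedure itself:

def evaluate_a_3_6_1(settings, baseline_output, output_after_change):
    # Step 1: if there are no author-display settings, the test is skipped.
    if not settings:
        return "SKIP"
    # Step 4: for each setting (changed one at a time), compare the captured
    # end-user output against the baseline; fail on the first difference found.
    for setting in settings:
        if output_after_change[setting] != baseline_output:
            return "FAIL"
    # Step 5: no setting affected the produced web content.
    return "PASS"

# Hypothetical example: two display settings, neither affects the output.
settings = ["editor font size", "syntax highlighting"]
baseline = "<p>Hello</p>"
after = {"editor font size": "<p>Hello</p>",
         "syntax highlighting": "<p>Hello</p>"}
print(evaluate_a_3_6_1(settings, baseline, after))   # prints PASS

The point is that a single differing capture is enough to return FAIL, so evaluators can stop as soon as they find one.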




(Mr) Jan Richards, M.Sc.
jrichards@ocadu.ca | 416-977-6000 ext. 3957 | fax: 416-977-9844
Inclusive Design Research Centre (IDRC) | http://idrc.ocad.ca/
Faculty of Design | OCAD University

From: Boland Jr, Frederick E. [mailto:frederick.boland@nist.gov]
Sent: August-01-12 3:47 PM
To: w3c-wai-au@w3.org
Subject: ATAG2.0 A.3.6.1 draft test - more work needed - per my action item

A.3.6.1 test:

1. Check whether the authoring tool includes settings that affect how the content being edited is perceived by the author (this can be determined from documentation or from author experience with the tool). If no such settings are included, then this SC is N/A for this authoring tool. If there are such settings, then document those settings/values and proceed to Step 2.

2. Create/open web content with the authoring tool.  Capture/document the actual content at this point in time, as well as how the content is perceived by the author (two separate characterizations) with the settings/values from Step 1.

3. Attempt to change each setting from Step 1 to a specified different value, one at a time (change only one setting value per iteration, leaving the other values unchanged, to avoid interactions between multiple simultaneous changes that could make the results ambiguous); loop over the settings. Use the setting values from Step 1 as the "base" or "control" of the experiment. If a setting value cannot be changed, then that setting is marked -fail- for this authoring tool; go to Step 6.

4. For each setting value changed, recapture/redocument the "same" content (from Step 2) immediately after the setting value was changed ("same" meaning that no explicit editing took place between Steps 2 and 4 by the author).

5. Check that the perception of the content is consistent with the changed setting value (i.e. it reflects the changed setting value, is "different" from the perception in Step 2, and the change in perception is recorded), AND check that the actual content is unchanged from the actual content in Step 2 (by doing a character-by-character comparison if necessary). If and only if both checks pass, the setting is marked -pass- for this authoring tool; for all other outcomes the setting is marked -fail-. Go back to Step 3 and repeat Steps 3-5 for a different setting until all settings have been checked, then proceed to Step 6. (A sketch of this per-setting check follows the list.)

6. If all settings checked have been marked -pass-, then this SC is passed for this authoring tool.  If one or more settings checked have been marked -fail-, then this SC fails for this authoring tool.
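
Here is a minimal Python sketch of the per-setting check in Steps 5-6 as written above; the content and perception strings are hypothetical placeholders standing in for whatever the evaluator captures/documents in Steps 2 and 4:

def check_setting(content_before, content_after, perception_before, perception_after):
    # Step 5: the author's perception should reflect the changed setting value...
    perception_changed = perception_after != perception_before
    # ...while the actual content must be unchanged (character-by-character).
    content_unchanged = content_after == content_before
    return "pass" if (perception_changed and content_unchanged) else "fail"

def check_sc(per_setting_results):
    # Step 6: the SC passes only if every checked setting was marked -pass-.
    return "pass" if all(r == "pass" for r in per_setting_results) else "fail"

# Hypothetical example: the first setting only changes the author's view;
# the second silently rewrites the stored content, so it is marked -fail-.
r1 = check_setting("<p>Hi</p>", "<p>Hi</p>", "12pt serif", "16pt sans-serif")
r2 = check_setting("<p>Hi</p>", "<P>Hi</P>", "12pt serif", "16pt sans-serif")
print(r1, r2, check_sc([r1, r2]))   # prints: pass fail fail

In other words, a setting passes only when the author-side view changes while the stored content stays byte-for-byte identical.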


Thanks and best wishes
Tim Boland NIST

Received on Friday, 3 August 2012 17:32:18 UTC