- Success Criteria 1.1: The authoring interface must pass required elements of the Software
Accessibility Guidelines testing criteria
NOTE: The tool designer is assumed to be knowledgeable about the tool; the author is one who uses the tool (no prior
knowledge of the tool is assumed, but the author can be the same person
as the tool designer, and many authors may perform tests on a single tool). A test plan may include combinations of machine testing and human-controlled testing.
"Form" below refers to a public listing of pertinent test data (including test number, success criteria
number, individuals involved, and results).
Test Plan
- tool designer lists capabilities of authoring interface on form (and other documentation
necessary for author to use)
- author lists whether ISO16071 or IBM SAGs are referenced for testing on form
- author lists required elements of ISO16071 on form
- author lists required elements of SAG on form
- author lists which required elements are passed by the authoring interface on form
- Success Criteria 1.2: At least one editing method must pass the Software
Accessibility Guidelines testing criteria for each element and object property editable
by the tool
Test Plan
- tool designer lists on form all editing methods to be considered (available) for the tool
- tool designer lists on form all elements editable by authoring tool
- tool designer lists on form all object properties editable by authoring tool
- author lists on form which testing criteria are used: ISO16071 or IBM SAG
- author tests each editing method listed above, involving the elements and object properties listed above,
against the selected testing criteria, and lists on form which criteria are passed as well as the editing method
used and the elements and object properties edited
- Success Criteria 1.3:
- All editing views must display text equivalents for any non-text content
- All editing views must either respect operating system display settings
(for color, contrast, size, and font) or, from within the tool, provide
a means of changing color, contrast, size, and font without affecting
the content markup
Test Plan
- Tool designer defines all editing views supported by the tool on form (as well as non-text content types supported?)
- Tool designer defines all possible operating system display settings for color, contrast, size and font supported by tool on form
- the above-mentioned editing views are each tested by author against random (all?) samples
of non-text content to make sure a text equivalent is displayed for each sample, and the results are entered on form
- the above-mentioned editing views are each tested by author against known settings for
color, contrast, size, and font, if this choice is checked on form
- the authoring tool is tested by author to see if color, contrast, size, and font can each be
changed (to known, verifiable values) in the above-mentioned editing views
for a reference piece of content without affecting the content markup, if this choice is checked on form (see the sketch following this test plan)
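NOTE (informative example): The markup-invariance part of this test could be partially automated. Below is a minimal Python sketch, assuming the author can export the content markup to a file before and after changing the display settings; the export step and file names are hypothetical.

    # Compare content markup saved before and after changing display settings.
    import hashlib

    def digest(path):
        # Hash the raw bytes of the exported markup.
        with open(path, "rb") as f:
            return hashlib.sha256(f.read()).hexdigest()

    before = digest("content_before_display_change.html")
    after = digest("content_after_display_change.html")

    if before == after:
        print("PASS: changing display settings did not alter the content markup")
    else:
        print("FAIL: the content markup changed when display settings were changed")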
- Success Criteria 2.1: All markup strings written automatically by the tool (i.e.,
not authored "by hand") must conform to the applicable markup language specification
Test Plan
- on form, tool designer defines how markup strings conforming to markup language specifications are automatically generated by the tool
- on form, tool designer specifies which markup language specifications are supported by the tool
- under author control, authoring tool generates a series of markup strings automatically
- author verifies each markup string against the appropriate language specification using the
defined mechanisms (see the sketch following this test plan)
- author lists on form each markup string generated and the conformance verification for that string
- if no pre-existing mechanism defined for conformance, author will explain on form how
each markup string conforms to each referenced specification mentioned before
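NOTE (informative example): Conformance verification can often be machine-assisted. A minimal Python sketch follows, assuming the lxml library is available and that a DTD for the target markup language exists locally; the DTD file name and the sample string are hypothetical.

    # Validate an automatically generated markup string against a DTD.
    from lxml import etree

    generated = "<doc><title>Sample</title><body>Generated content</body></doc>"

    dtd = etree.DTD(open("target_language.dtd"))   # hypothetical DTD file
    tree = etree.fromstring(generated)

    if dtd.validate(tree):
        print("PASS: generated markup is valid against the DTD")
    else:
        print("FAIL: generated markup is invalid:")
        for error in dtd.error_log.filter_from_errors():
            print(" ", error.message)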
- Success Criteria 2.2: In order to give priority to a format, that format must have a
published techniques document for meeting each Level 1 WCAG (10/27/03 draft) success criterion
(NOTE: Should this be a Relative Priority? Applicable to ATAG AA and AAA for WCAG levels 2 and 3?)
Test Plan
- Tool designer lists on form formats supported by the tool
- Author verifies on form that all of the following are true for each supported format mentioned before (or N/A if not applicable?):
- For every referenced format, there exists a published techniques document for that format
that indicates that non-text content that can be expressed in words has a text-equivalent explicitly associated with it.
- For every referenced format, there exists a published techniques document for that format
that indicates that non-text content that can not be expressed in words has a descriptive label provided as its text-equivalent.
- For every referenced format, there exists a published techniques document for that format
that indicates that an audio description is provided.
- For every referenced format, there exists a published techniques document for that format
that indicates that all significant dialogue and sounds are captioned.
- For every referenced format, there exists a published techniques document for that format
that indicates that descriptions and captions are synchronized with the events they represent.
- For every referenced format, there exists a published techniques document for that format
that indicates that if the Web content is real-time video with audio, real-time captions are provided unless the content:
is a music program that is primarily non-vocal
- For every referenced format, there exists a published techniques document for that format
that indicates that if the Web content is real-time non-interactive video (e.g., a Webcam of ambient conditions), either provide
an equivalent that conforms to items 1 and 2 of this list (e.g., an ongoing update of weather conditions) or link to an equivalent
that conforms to items 1 and 2 of this list (e.g., a link to a weather Web site).
- For every referenced format, there exists a published techniques document for that format
that indicates that if a pure audio or pure video presentation requires a user to respond interactively at specific times in the
presentation, then a time-synchronized equivalent (audio, visual or text) presentation is provided
- For every referenced format, there exists a published techniques document for that format
that indicates that the following can be derived programmatically (i.e. through a markup or data model that is assistive technology
compatible) from the content without requiring interpretation of presentation:
- any hierarchical elements and relationships, such as headings, paragraphs and lists
- any non-hierarchical relationships between elements such as cross-references and linkages, associations between
labels and controls, associations between cells and their headers, etc.
- any emphasis
- For every referenced format, there exists a published techniques document for that format
that indicates that any information presented through color is also available without color
- For every referenced format, there exists a published techniques document for that format
that indicates that text content is not presented over a background image or pattern OR the text is easily readable
when the page is viewed in black and white
- For every referenced format, there exists a published techniques document for that format
that indicates that text in the content is provided in Unicode or sufficient information is provided so that it can
be automatically mapped back to Unicode.
- For every referenced format, there exists a published techniques document for that format
that indicates that all of the functionality of the content, where the functionality or its outcome can be expressed
concisely in words, is operable at a minimum through a keyboard or keyboard interface.
- For every referenced format, there exists a published techniques document for that format
that indicates that content is designed so that time limits are not an essential part of interaction or
at least one of the following is true for each time limit:
the user is allowed to deactivate the time limits,
or the user is allowed to adjust the time limit over a wide range which is at least 10 times
the average user's preference,
or the user is warned before time expires and given at least 10 seconds to extend the time limit,
or the time limit is due to a real-time event (e.g. auction) and no alternative to the time limit is possible,
or the time limit is part of a competitive activity where timing is an essential part of the activity
(e.g. competitive gaming or time based testing).
- For every referenced format, there exists a published techniques document for that format
that indicates that at least one of the following is true:
- content was not designed to flicker (or flash) in the range of 3 to 49 Hz
- if flicker is unavoidable, the user is warned of the flicker before they go to the page,
and as close a version of the content as is possible without flicker is provided
- For every referenced format, there exists a published techniques document for that format
that indicates that passages or fragments of text occurring within the content that are written in a language other
than the primary natural language of the content as a whole, are identified, including specification
of the language of the passage or fragment
- For every referenced format, there exists a published techniques document for that format
that indicates that document attributes identify the natural language of the document.
- For every referenced format, there exists a published techniques document for that format
that indicates that for markup, except where the site has documented that a specification was
violated for backward compatibility, the markup has:
- passed validity tests of the language (whether it be conforming to a schema, Document Type Definition (DTD),
or other tests described in the specification)
- structural elements and attributes are used as defined in the specification
- accessibility features are used
- deprecated features are avoided
- For every referenced format, there exists a published techniques document for that format
that indicates that any custom user interface elements of the content conform to at least level A of
the User Agent Accessibility Guidelines 1.0, and that if the custom user interfaces cannot be made accessible,
an alternative solution is provided that meets WCAG2.0 to the level claimed.
- Author provides a link to each techniques document mentioned above on the form for each supported format
- Success Criteria 2.3: Tools must always meet at least one of the following:
- generate accessible content automatically
- provide a method for authoring "by hand"
- provide the author with accessible options for every authoring task
Test Plan
- tool designer declares the capability of their authoring tool on a form by checking one or more of three boxes corresponding
to the three items in the success criteria
- if appropriate item is checked, tool designer explains on form how function is accomplished
- author states on the form whether the tool successfully generates accessible content automatically,
if that item is checked on the form, and if so, under what conditions it does this, OR
- author states on the form whether the tool successfully provides a method for authoring "by hand",
if that item is checked on the form, and if so, under what conditions it does this, OR
- author states on the form whether the tool provides accessible options for every authoring task,
if that item is checked on the form, and if so, under what conditions it does this
- for each of the above, author provides a description of the authoring task and the resulting content
- Success Criteria 2.4:
- During all transformations and conversions, any accessibility information
must be preserved, unless prevented by limitations of the target format
- When accessibility information cannot be preserved during a conversion or
transformation, the author must be notified beforehand.
Test Plan
- tool designer defines all transformations and conversions possible using their tool on a form
- tool designer defines all accessibility information that can be provided by their tool on a form
- tool designer lists any limitations or restrictions imposed by the target format on a form
- author tries some sample transformations, and defines any accessibility information pertinent to
that transformation on a form; if no accessibility information exists, it should be so stated
- author compares the accessibility information defined after the transformation to that defined
before, to make sure the same information exists, and presents the results on a form, for each transformation (see the sketch following this test plan)
- author states on the form whether prior notification was given to the author if the previous tests gave
negative results (prior meaning before the transformation was attempted)
- author verifies that if prior notification was made, author was given choice to abort transformation,
and if author chose to abort, the transformation was in fact not attempted; this info is presented on a form
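NOTE (informative example): The before/after comparison can be partly automated when both versions are well-formed markup. A minimal Python sketch follows; the attribute list and file names are assumptions for illustration, not an exhaustive definition of accessibility information.

    # Compare accessibility-related attributes before and after a transformation.
    import xml.etree.ElementTree as ET

    ACCESSIBILITY_ATTRS = {"alt", "longdesc", "title", "lang", "summary", "headers", "scope"}

    def accessibility_info(path):
        info = set()
        for elem in ET.parse(path).iter():
            for name, value in elem.attrib.items():
                local = name.split("}")[-1]   # strip any namespace prefix
                if local in ACCESSIBILITY_ATTRS:
                    info.add((elem.tag, local, value))
        return info

    before = accessibility_info("before_transformation.xhtml")
    after = accessibility_info("after_transformation.xhtml")

    lost = before - after
    if lost:
        print("FAIL: accessibility information lost in transformation:")
        for item in sorted(lost):
            print(" ", item)
    else:
        print("PASS: all accessibility information was preserved")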
- Success Criteria 3.4:
- When the author inserts an unrecognized non-text object, the tool must not insert an
automatically generated text equivalent (e.g. label generated from the file name)
- When the author inserts a non-text object for which the tool has a previously authored
equivalent (i.e. created by the author, tool designer, pre-authored content developer, etc.),
but the function of the object is not known with certainty, the tool must prompt the author
to confirm insertion of the equivalent. However, where the function of the non-text object
is known with certainty (e.g. "home button" on a navigation bar, etc.), the tool may
automatically insert the equivalent.
Test Plan
- tool designer defines capability of tool re: handling of unrecognized non-text objects on a form
- While editing with the tool, the author inserts an unrecognized non-text object
- Author verifies for every such insertion that tool does not insert text equivalent and presents this info on form
- tool designer defines all known non-text objects for which text equivalents exist, and gives on a form each object, its
text equivalent (or a link to it), and the function of the object
- author inserts non-text object on list mentioned before, verifies that tool automatically inserts correct text equivalent,
and provides this info on a form
- author inserts non-text object not on list mentioned before, and verifies that tool does not insert equivalent, but
prompts the author before equivalent is inserted; this info is provided on a form
- author verifies that for such prompting, if the author accepts, the tool does in fact insert the text equivalent, and if the author
declines, the tool does not insert the text equivalent; this info is provided on a form (see the decision-logic sketch following this test plan)
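NOTE (informative example): The expected behavior can be summarized as a small decision rule. The Python sketch below is illustrative only; the object list, labels, and prompt are hypothetical and do not describe any particular tool's API.

    KNOWN_EQUIVALENTS = {
        # object identifier -> (text equivalent, function known with certainty?)
        "nav_home.gif": ("Home button", True),
        "photo_0231.jpg": ("Staff photograph", False),
    }

    def equivalent_on_insert(object_id, ask_author):
        """Return the text equivalent to insert, or None to insert nothing."""
        if object_id not in KNOWN_EQUIVALENTS:
            # Unrecognized object: never generate an equivalent automatically
            # (e.g., no label derived from the file name).
            return None
        equivalent, certain = KNOWN_EQUIVALENTS[object_id]
        if certain:
            # Function known with certainty: the tool may insert automatically.
            return equivalent
        # Previously authored equivalent, but function uncertain: confirm first.
        return equivalent if ask_author('Insert "%s" for %s?' % (equivalent, object_id)) else None

    # Example run with a canned "author" that declines every prompt.
    print(equivalent_on_insert("unknown_chart.png", lambda q: False))  # None
    print(equivalent_on_insert("nav_home.gif", lambda q: False))       # Home button
    print(equivalent_on_insert("photo_0231.jpg", lambda q: False))     # None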
- Success Criteria 3.7: All features that play a role in creating accessible content must be
documented in the help system.
Test Plan
- tool designer lists on form all features assisting accessibility of content generated by the tool,
and where those features are in the help system
- tool designer lists on form nature of help system (documentation) for the tool, and how to use help system.
- author verifies on form that every feature listed is found in the help system at the stated location,
checking that the feature is included in the documentation
- Author verifies on form for each feature a level of understanding of the feature's capabilities
- Success Criteria 4.2:
- Continuously active processes (e.g. a checker that underlines errors as they occur, a
checker that activates at a save, a checker that runs every 10 minutes, etc.) that implement
functions required by checkpoints 3.1, 3.2, 3.3, and 3.7 must be enabled by default
- If the author chooses to disable these continuously active processes, then the tool
must inform the author of the consequences of their choice
- User-initiated processes (e.g. a checker that the user requests each time) that implement
functions required by checkpoints 3.1, 3.2, 3.3 and 3.7 must be available to the author
at "all times during authoring" with no more steps than other "high-priority functions"
- When the functions required by checkpoints 3.1, 3.2, 3.3 and 3.7 are combined with other
authoring functions (e.g., an accessibility-related field in a general purpose dialog box),
the resulting design must include all accessibility-related functions in the top level
of the combined user interface
Test Plan
- tool designer lists on form all continuously active processes implementing functions required by
checkpoints 3.1, 3.2, 3.3, and 3.7 and supported by the tool, as well as how they operate
- author verifies on form that all such processes listed in fact work correctly as described
- tool designer lists on form how to disable any of previously-mentioned processes
- author verifies on form that for all such processes listed, the author is given a choice by the tool as to
whether to disable each process, using knowledge provided by the tool designer, and that the tool informs the author
of the consequences of each choice
- author verifies on form that if choice is made to disable such a process (using information given previously), the
process is successfully disabled
- author verifies on form that if choice is not made to disable such a process (using information given previously), the
process is still enabled (works correctly)
- tool designer lists on form all user-initiated processes available in the tool supporting checkpoints
3.1, 3.2, 3.3, and 3.7, the number of steps required to access each process, and certification that the number of
steps is no greater than that of other listed high-priority functions (see the sketch following this test plan)
- author verifies on form that when each of these processes is tried, they work correctly every time using the
number of steps indicated by the tool designer
- tool designer lists on form all 3.1, 3.2, 3.3, 3.7 functions that may be combined with other authoring
functions (list) supported by the tool
- tool designer lists on form characteristics of design of top level of user interface of the tool
- author verifies that all 3.1, 3.2, 3.3, and 3.7 functions are immediately available and locatable at the top level
of the user interface of the tool, using the previous information, and (work correctly?).
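NOTE (informative example): The step-count comparison lends itself to a simple tabulation. The Python sketch below uses hypothetical step counts that a tester would take from the tool designer's form; it compares each accessibility-related process against the largest step count among the listed high-priority functions.

    accessibility_processes = {
        "Run accessibility checker": 2,
        "Prompt for text equivalents": 1,
    }
    high_priority_functions = {
        "Save": 1,
        "Print": 2,
        "Spell check": 2,
    }

    worst_high_priority = max(high_priority_functions.values())
    for name, steps in accessibility_processes.items():
        verdict = "PASS" if steps <= worst_high_priority else "FAIL"
        print("%s: %s takes %d step(s) (allowed: %d)" % (verdict, name, steps, worst_high_priority))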
------------------------
- Success Criteria 2.5: All markup strings written automatically by the tool (i.e.,
not authored "by hand") must satisfy all of the WCAG2.0 (10/27/03 draft) Level 1 success criteria (RELATIVE PRIORITY):
Test Plan
- Tool designer enters on form how markup strings are written by the authoring tool.
- Using this info, author (tester) tests all of the following (or enters N/A as appropriate) and enters results for each item on form (a partial automated check is sketched after this list):
- Tester verifies that for every markup string written automatically by the tool, non-text content that can be expressed in words has a text-equivalent explicitly associated with it.
- Tester verifies that for every markup string written automatically by the tool, non-text content that can not be expressed in words has a descriptive label provided as its text-equivalent.
- Tester verifies that for every markup string written automatically by the tool, an audio description is provided.
- Tester verifies that for every markup string written automatically by the tool, all significant dialogue and sounds are captioned.
- Tester verifies that for every markup string written automatically by the tool, descriptions and captions are synchronized with the events they represent.
- Tester verifies that for every markup string written automatically by the tool, if the Web content is real-time video with audio, real-time captions are provided unless the content:
is a music program that is primarily non-vocal
- Tester verifies that for every markup string written automatically by the tool, if the Web content is real-time non-interactive video (e.g., a Webcam of ambient conditions), either provide
an equivalent that conforms to items 1 and 2 of this list (e.g., an ongoing update of weather conditions) or link to an equivalent
that conforms to items 1 and 2 of this list (e.g., a link to a weather Web site).
- Tester verifies that for every markup string written automatically by the tool, if a pure audio or pure video presentation requires a user to respond interactively at specific times in the
presentation, then a time-synchronized equivalent (audio, visual or text) presentation is provided
- Tester verifies that for every markup string written automatically by the tool, the following can be derived programmatically (i.e. through a markup or data model that is assistive technology
compatible) from the content without requiring interpretation of presentation:
- any hierarchical elements and relationships, such as headings, paragraphs and lists
- any non-hierarchical relationships between elements such as cross-references and linkages, associations between
labels and controls, associations between cells and their headers, etc.
- any emphasis
- Tester verifies that for every markup string written automatically by the tool, any information presented through color is also available without color
- Tester verifies that for every markup string written automatically by the tool, text content is not presented over a background image or pattern OR the text is easily readable
when the page is viewed in black and white
- Tester verifies that for every markup string written automatically by the tool, text in the content is provided in Unicode or sufficient information is provided so that it can
be automatically mapped back to Unicode.
- Tester verifies that for every markup string written automatically by the tool, all of the functionality of the content, where the functionality or its outcome can be expressed
concisely in words, is operable at a minimum through a keyboard or keyboard interface.
- Tester verifies that for every markup string written automatically by the tool, content is designed so that time limits are not an essential part of interaction or
at least one of the following is true for each time limit:
the user is allowed to deactivate the time limits,
or the user is allowed to adjust the time limit over a wide range which is at least 10 times
the average user's preference,
or the user is warned before time expires and given at least 10 seconds to extend the time limit,
or the time limit is due to a real-time event (e.g. auction) and no alternative to the time limit is possible,
or the time limit is part of a competitive activity where timing is an essential part of the activity
(e.g. competitive gaming or time based testing).
- Tester verifies that for every markup string written automatically by the tool, at least one of the following is true:
- content was not designed to flicker (or flash) in the range of 3 to 49 Hz
- if flicker is unavoidable, the user is warned of the flicker before they go to the page,
and as close a version of the content as is possible without flicker is provided
- Tester verifies that for every markup string written automatically by the tool, passages or fragments of text occurring within the content that are written in a language other
than the primary natural language of the content as a whole, are identified, including specification
of the language of the passage or fragment
- Tester verifies that for every markup string written automatically by the tool, document attributes identify the natural language of the document.
- Tester verifies that for every markup string written automatically by the tool, for markup, except where the site has documented that a specification was
violated for backward compatibility, the markup has:
- passed validity tests of the language (whether it be conforming to a schema, Document Type Definition (DTD),
or other tests described in the specification)
- structural elements and attributes are used as defined in the specification
- accessibility features are used
- deprecated features are avoided
- Tester verifies that for every markup string written automatically by the tool, any custom user interface elements of the content conform to at least level A of
the User Agent Accessibility Guidelines 1.0, and that if the custom user interfaces cannot be made accessible,
an alternative solution is provided that meets WCAG2.0 to the level claimed.
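NOTE (informative example): A few of the Level 1 items above are machine-checkable; most still require human judgement. The Python sketch below checks two of them for a generated markup string (text equivalents on img elements, and identification of the document language); the sample string is hypothetical.

    from html.parser import HTMLParser

    class PartialCheck(HTMLParser):
        def __init__(self):
            super().__init__()
            self.problems = []
            self.has_lang = False

        def handle_starttag(self, tag, attrs):
            attrs = dict(attrs)
            if tag == "img" and not attrs.get("alt"):
                self.problems.append("img without alt text: %s" % attrs.get("src", "?"))
            if tag == "html" and attrs.get("lang"):
                self.has_lang = True

    generated = '<html><body><img src="logo.gif"></body></html>'
    check = PartialCheck()
    check.feed(generated)
    if not check.has_lang:
        check.problems.append("document language (lang attribute) not identified")
    for problem in check.problems:
        print("FAIL:", problem)
    if not check.problems:
        print("PASS: no problems found by this partial check")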
- Success Criteria 2.6: Any web content (e.g., templates, clip art, multimedia objects,
scripts, applets, example pages, etc.) preferentially licensed (i.e., better terms
for users of the tool than for others) for users of the tool, must satisfy all of the
Level 1 WCAG2.0 (10/27/03 draft) success criteria
(RELATIVE PRIORITY):
Test Plan
- Tool designer defines all content preferentially licensed for the tool (and what preferential licensing means) on a form
- Using this info, author (tester) tests all of the following and enters result for each (or N/A as appropriate) on a form:
- Tester verifies that for every instance of such content, non-text content that can be expressed in words has a text-equivalent explicitly associated with it.
- Tester verifies that for every instance of such content, non-text content that can not be expressed in words has a descriptive label provided as its text-equivalent.
- Tester verifies that for every instance of such content, an audio description is provided.
- Tester verifies that for every instance of such content, all significant dialogue and sounds are captioned.
- Tester verifies that for every instance of such content, descriptions and captions are synchronized with the events they represent.
- Tester verifies that for every instance of such content, if the Web content is real-time video with audio, real-time captions are provided unless the content:
is a music program that is primarily non-vocal
- Tester verifies that for every instance of such content, if the Web content is real-time non-interactive video (e.g., a Webcam of ambient conditions), either provide
an equivalent that conforms to items 1 and 2 of this list (e.g., an ongoing update of weather conditions) or link to an equivalent
that conforms to items 1 and 2 of this list (e.g., a link to a weather Web site).
- Tester verifies that for every instance of such content, if a pure audio or pure video presentation requires a user to respond interactively at specific times in the
presentation, then a time-synchronized equivalent (audio, visual or text) presentation is provided
- Tester verifies that for every instance of such content, the following can be derived programmatically (i.e. through a markup or data model that is assistive technology
compatible) from the content without requiring interpretation of presentation:
- any hierarchical elements and relationships, such as headings, paragraphs and lists
- any non-hierarchical relationships between elements such as cross-references and linkages, associations between
labels and controls, associations between cells and their headers, etc.
- any emphasis
- Tester verifies that for every instance of such content, any information presented through color is also available without color
- Tester verifies that for every instance of such content, text content is not presented over a background image or pattern OR the text is easily readable
when the page is viewed in black and white
- Tester verifies that for every instance of such content, text in the content is provided in Unicode or sufficient information is provided so that it can
be automatically mapped back to Unicode.
- Tester verifies that for every instance of such content, all of the functionality of the content, where the functionality or its outcome can be expressed
concisely in words, is operable at a minimum through a keyboard or keyboard interface.
- Tester verifies that for every instance of such content, content is designed so that time limits are not an essential part of interaction or
at least one of the following is true for each time limit:
the user is allowed to deactivate the time limits,
or the user is allowed to adjust the time limit over a wide range which is at least 10 times
the average user's preference,
or the user is warned before time expires and given at least 10 seconds to extend the time limit,
or the time limit is due to a real-time event (e.g. auction) and no alternative to the time limit is possible,
or the time limit is part of a competitive activity where timing is an essential part of the activity
(e.g. competitive gaming or time based testing).
- Tester verifies that for every instance of such content, at least one of the following is true:
- content was not designed to flicker (or flash) in the range of 3 to 49 Hz
- if flicker is unavoidable, the user is warned of the flicker before they go to the page,
and as close a version of the content as is possible without flicker is provided
- Tester verifies that for every instance of such content, passages or fragments of text occurring within the content that are written in a language other
than the primary natural language of the content as a whole, are identified, including specification
of the language of the passage or fragment
- Tester verifies that for every instance of such content, document attributes identify the natural language of the document.
- Tester verifies that for every instance of such content, for markup, except where the site has documented that a specification was
violated for backward compatibility, the markup has:
- passed validity tests of the language (whether it be conforming to a schema, Document Type Definition (DTD),
or other tests described in the specification)
- structural elements and attributes are used as defined in the specification
- accessibility features are used
- deprecated features are avoided
- Tester verifies that for every instance of such content, any custom user interface elements of the content conform to at least level A of
the User Agent Accessibility Guidelines 1.0, and that if the custom user interfaces cannot be made accessible,
an alternative solution is provided that meets WCAG2.0 to the level claimed.
- Success Criteria 3.1:
- When the actions of the author risk creating accessibility problems (not satisfying
any of the WCAG2.0 (10/27/03) Level 1
success criteria), the tool must intervene to introduce the appropriate accessible
authoring practice. This intervention may proceed according to a user-configurable schedule.
- The intervention must occur at least once before completion of authoring (e.g., final save,
publishing, etc.)
(RELATIVE PRIORITY):
Test Plan
- tool designer describes all intervention features of tool and accessibility issues prompting such intervention on a form
- tool designer describes how a user could configure the schedule of intervention
- Using this info, author tests all of the following and enters results on a form (or enters N/A as appropriate for each item)
(NOTE: x is a small number that is assumed always to occur before completion of authoring):
- If non-text content that can be expressed in words does not have a text-equivalent explicitly associated with it, the author is notified within x seconds of the event.
- If non-text content that can not be expressed in words does not have a descriptive label provided as its text-equivalent, the author is notified within x seconds of the event.
- If it is determined that an audio description is not provided of all significant visual information in scenes, actions, and events that
cannot be perceived from the sound track alone to the extent possible given the constraints posed by the existing
audio track and limitations on freezing the audio visual program to insert additional auditory description, the author is notified within x seconds of the determination.
- If it is determined that not all significant dialogue and sounds are captioned (EXCEPTION: If the Web content is real-time and audio-only and
not time-sensitive and not interactive, a transcript or other non-audio equivalent is sufficient), then the author is notified within x seconds of the event.
- If it is determined that descriptions and captions are not synchronized with the events they represent, then the author is notified within x seconds of the event.
- If it is determined that if the Web content is real-time video with audio, real-time captions are not provided (unless the content:
is a music program that is primarily non-vocal), then the author is notified within x seconds of the event.
- If it is determined that if the Web content is real-time non-interactive video (e.g., a Webcam of ambient conditions), neither is provided:
an equivalent that conforms to items 1 and 2 of this list (e.g., an ongoing update of weather conditions) or link to an equivalent
that conforms to items 1 and 2 of this list (e.g., a link to a weather Web site), the author is notified x seconds after the event.
- If it is determined that if a pure audio or pure video presentation requires a user to respond interactively at specific times in the
presentation, then a time-synchronized equivalent (audio, visual or text) presentation is not provided, then the author is notified within x seconds of the event.
- If it is determined that the following can not be derived programmatically (i.e. through a markup or data model that is assistive technology
compatible) from the content without requiring interpretation of presentation, then the author is notified within x seconds after the event.
- any hierarchical elements and relationships, such as headings, paragraphs and lists
- any non-hierarchical relationships between elements such as cross-references and linkages, associations between
labels and controls, associations between cells and their headers, etc.
- any emphasis
- If it is determined that any information presented through color is not available without color, the author is notified x seconds after the determination.
- If it is determined that text content is presented over a background image or pattern and the text is not easily readable when the page is viewed in
black and white, the author is notified within x seconds of the determination.
- If it is determined that text in the content is not provided in Unicode or sufficient information is not provided so that it can
be automatically mapped back to Unicode, then the author is notified x seconds after the event.
- If it is determined that all of the functionality of the content, where the functionality or its outcome can be expressed
concisely in words, is not operable at a minimum through a keyboard or keyboard interface, the author is notified x seconds after the event.
- If it is determined that content is not designed so that time limits are not an essential part of interaction and at least one of the following is not true for each time limit, the author is notified x seconds after the event.
the user is allowed to deactivate the time limits,
or the user is allowed to adjust the time limit over a wide range which is at least 10 times
the average user's preference,
or the user is warned before time expires and given at least 10 seconds to extend the time limit,
or the time limit is due to a real-time event (e.g. auction) and no alternative to the time limit is possible,
or the time limit is part of a competitive activity where timing is an essential part of the activity
(e.g. competitive gaming or time based testing).
- If it is determined that at least one of the following is not true, then the author is notified x seconds after the event:
- content was not designed to flicker (or flash) in the range of 3 to 49 Hz
- if flicker is unavoidable, the user is warned of the flicker before they go to the page,
and as close a version of the content as is possible without flicker is provided
- If it is determined that passages or fragments of text occurring within the content that are written in a language other
than the primary natural language of the content as a whole, are not identified, including specification
of the language of the passage or fragment, then the author is notified x seconds after the event.
- If it is determined that document attributes do not identify the natural language of the document, the author is notified x seconds after the determination.
- If it is determined that for markup, except where the site has documented that a specification was violated for backward compatibility, the markup has not
satisfied items below, the author is notified x seconds after the event:
- passed validity tests of the language (whether it be conforming to a schema, Document Type Definition (DTD),
or other tests described in the specification)
- structural elements and attributes are used as defined in the specification
- accessibility features are used
- deprecated features are avoided
- If it is determined that any custom user interface elements of the content do not conform to at least Level A of the User Agent Accessibility
Guidelines 1.0, and, if the custom user interfaces cannot be made accessible, an alternative solution is not provided that
meets WCAG2.0 (including this provision) to the level claimed, the author is notified within x seconds of the determination
(a sketch for verifying notification timing follows this test plan).
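NOTE (informative example): Verifying "within x seconds" requires timing the interval between detection and notification. The Python sketch below only illustrates the measurement; the two hooks are entirely hypothetical and would have to be attached to whatever logging or scripting interface the tool actually exposes.

    import time

    X_SECONDS = 5.0   # the small number x from the note above (assumed value)

    problem_detected_at = None

    def on_problem_detected():
        # Hypothetical hook: called when the tool detects, e.g., missing alt text.
        global problem_detected_at
        problem_detected_at = time.monotonic()

    def on_author_notified():
        # Hypothetical hook: called when the intervention is shown to the author.
        delay = time.monotonic() - problem_detected_at
        verdict = "PASS" if delay <= X_SECONDS else "FAIL"
        print("%s: author notified %.2f s after detection" % (verdict, delay))

    # Simulated sequence for illustration only.
    on_problem_detected()
    time.sleep(1.0)
    on_author_notified()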
- Success Criteria 3.2: The tool must provide a check (automated check, semi-automated check or
manual check) for detecting violations of each Level 1 success criterion of WCAG2.0 (10/27/03 draft) (RELATIVE PRIORITY):
Test Plan
- tool designer describes on form all checks supported by the tool for violation detection, what kind of check each is, and how authors would
recognize or use info provided by each check
- given this info, author tests each following item and enters results on a form (or N/A for items as appropriate):
- If non-text content that can be expressed in words does not have a text-equivalent explicitly associated with it, tester verifies that the tool provides a check for this.
- If non-text content that can not be expressed in words does not have a descriptive label provided as its text-equivalent, tester verifies that the tool provides a check for this.
- If an audio description is not provided, the tester verifies that a check is provided by the tool for this.
- If all significant dialogue and sounds are not captioned, the tester verifies that a check is provided by the tool for this.
- If descriptions and captions are not synchronized with the events they represent, the tester verifies that a check is provided by the tool for this.
- If the Web content is real-time video with audio, real-time captions are not provided (unless the content is a music program that is primarily non-vocal),
the tester verifies that a check is provided by the tool for this.
- If the Web content is real-time non-interactive video (e.g., a Webcam of ambient conditions), neither is provided:
an equivalent that conforms to items 1 and 2 of this list (e.g., an ongoing update of weather conditions) or link to an equivalent
that conforms to items 1 and 2 of this list (e.g., a link to a weather Web site), the tester verifies that a check is provided by the tool for this.
- If a pure audio or pure video presentation requires a user to respond interactively at specific times in the
presentation, then a time-synchronized equivalent (audio, visual or text) presentation is not provided, the tester verifies that a check is provided by the tool for this.
- If the following can not be derived programmatically (i.e. through a markup or data model that is assistive technology
compatible) from the content without requiring interpretation of presentation, the tester verifies that a check is provided by the tool:
- any hierarchical elements and relationships, such as headings, paragraphs and lists
- any non-hierarchical relationships between elements such as cross-references and linkages, associations between
labels and controls, associations between cells and their headers, etc.
- any emphasis
- If any information presented through color is not available without color, the tester verifies that a check is provided by the tool for this.
- If text content is presented over a background image or pattern and the text is not easily readable
when the page is viewed in black and white, the tester verifies that a check is provided by the tool.
- If text in the content is not provided in Unicode or sufficient information is not provided so that it can
be automatically mapped back to Unicode, the tester verifies that a check is provided by the tool for this.
- If all of the functionality of the content, where the functionality or its outcome can be expressed
concisely in words, is not operable at a minimum through a keyboard or keyboard interface, the tester verifies that a check is provided by the tool for this.
- If time limits are an essential part of interaction and
none of the following is true for each time limit:
the user is allowed to deactivate the time limits,
or the user is allowed to adjust the time limit over a wide range which is at least 10 times
the average user's preference,
or the user is warned before time expires and given at least 10 seconds to extend the time limit,
or the time limit is due to a real-time event (e.g. auction) and no alternative to the time limit is possible,
or the time limit is part of a competitive activity where timing is an essential part of the activity
(e.g. competitive gaming or time based testing), the tester verifies that a check is provided for this.
- If none of the following is true, the tester verifies that a check is provided for this:
- content was not designed to flicker (or flash) in the range of 3 to 49 Hz
- if flicker is unavoidable, the user is warned of the flicker before they go to the page,
and as close a version of the content as is possible without flicker is provided
- If passages or fragments of text occurring within the content that are written in a language other
than the primary natural language of the content as a whole, are not identified, including specification
of the language of the passage or fragment, the tester verifies that a check is provided for this.
- If document attributes do not identify the natural language of the document, the tester verifies that a check
is provided for this.
- If, for markup, except where the site has documented that a specification was
violated for backward compatibility, the markup fails to satisfy one or more of the items below, the tester verifies that a check is
made for this:
- passed validity tests of the language (whether it be conforming to a schema, Document Type Definition (DTD),
or other tests described in the specification)
- structural elements and attributes are used as defined in the specification
- accessibility features are used
- deprecated features are avoided
- If any custom user interface elements of the content do not conform to at least level A of
the User Agent Accessibility Guidelines 1.0, and, when the custom user interfaces cannot be made accessible,
an alternative solution is not provided that meets WCAG2.0 to the level claimed, the tester verifies that a check
is made by the tool for this (a sketch of one such automated check follows this test plan).
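NOTE (informative example): One of the markup items above ("deprecated features are avoided") is easy to check automatically. The Python sketch below scans a markup string for elements deprecated in HTML 4.01; the element list and sample content are illustrative assumptions.

    from html.parser import HTMLParser

    DEPRECATED = {"applet", "basefont", "center", "dir", "font", "isindex",
                  "menu", "s", "strike", "u"}

    class DeprecatedCheck(HTMLParser):
        def __init__(self):
            super().__init__()
            self.found = []

        def handle_starttag(self, tag, attrs):
            if tag in DEPRECATED:
                self.found.append(tag)

    content = "<p><font color='red'>Warning</font></p>"
    check = DeprecatedCheck()
    check.feed(content)
    if check.found:
        print("FAIL: deprecated elements used:", ", ".join(sorted(set(check.found))))
    else:
        print("PASS: no deprecated elements found")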
- Success Criteria 3.3: The tool must provide a repair (automated repair, semi-automated repair or
manual repair) for correcting violations of each Level 1 success criterion of WCAG2.0 (10/27/03 draft) (RELATIVE PRIORITY):
Test Plan
- tool designer describes on form all repairs provided for violation correction by the tool, and what kind of repair it is
- given this info, author tests each item below to ensure repair successfully made (or enters N/A for item as appropriate), and enters results on form:
- If non-text content that can be expressed in words does not have a text-equivalent explicitly associated with it, tester verifies that the tool successfully repairs this problem.
- If non-text content that can not be expressed in words does not have a descriptive label provided as its text-equivalent, tester verifies that the tool successfully repairs this problem.
- If an audio description is not provided, the tester verifies that the tool successfully repairs this problem.
- If all significant dialogue and sounds are not captioned, the tester verifies that the tool successfully repairs this problem.
- If descriptions and captions are not synchronized with the events they represent, the tester verifies that the tool successfully repairs this problem.
- If the Web content is real-time video with audio, real-time captions are not provided (unless the content is a music program that is primarily non-vocal),
the tester verifies that the tool successfully repairs this problem.
- If the Web content is real-time non-interactive video (e.g., a Webcam of ambient conditions), neither is provided:
an equivalent that conforms to items 1 and 2 of this list (e.g., an ongoing update of weather conditions) or link to an equivalent
that conforms to items 1 and 2 of this list (e.g., a link to a weather Web site), the tester verifies that the tool successfully repairs this problem.
- If a pure audio or pure video presentation requires a user to respond interactively at specific times in the
presentation, then a time-synchronized equivalent (audio, visual or text) presentation is not provided, the tester verifies that the tool successfully repairs this problem.
- If the following can not be derived programmatically (i.e. through a markup or data model that is assistive technology
compatible) from the content without requiring interpretation of presentation, the tester verifies that a successful repair is provided by the tool:
- any hierarchical elements and relationships, such as headings, paragraphs and lists
- any non-hierarchical relationships between elements such as cross-references and linkages, associations between
labels and controls, associations between cells and their headers, etc.
- any emphasis
- If any information presented through color is not available without color, the tester verifies that the tool successfully repairs this problem.
- If text content is presented over a background image or pattern and the text is not easily readable
when the page is viewed in black and white, the tester verifies that the tool successfully repairs this problem.
- If text in the content is not provided in Unicode or sufficient information is not provided so that it can
be automatically mapped back to Unicode, the tester verifies that the tool successfully repairs this problem.
- If all of the functionality of the content, where the functionality or its outcome can be expressed
concisely in words, is not operable at a minimum through a keyboard or keyboard interface, the tester verifies that the tool successfully repairs this problem.
- If time limits are an essential part of interaction and
none of the following is true for each time limit:
the user is allowed to deactivate the time limits,
or the user is allowed to adjust the time limit over a wide range which is at least 10 times
the average user's preference,
or the user is warned before time expires and given at least 10 seconds to extend the time limit,
or the time limit is due to a real-time event (e.g. auction) and no alternative to the time limit is possible,
or the time limit is part of a competitive activity where timing is an essential part of the activity
(e.g. competitive gaming or time based testing), then the tester verifies that the tool successfully repairs this problem.
- If none of the following is true, the tester verifies that the tool successfully repairs this:
- content was not designed to flicker (or flash) in the range of 3 to 49 Hz
- if flicker is unavoidable, the user is warned of the flicker before they go to the page,
and as close a version of the content as is possible without flicker is provided
- If passages or fragments of text occurring within the content that are written in a language other
than the primary natural language of the content as a whole, are not identified, including specification
of the language of the passage or fragment, the tester verifies that the tool successfully repairs this.
- If document attributes do not identify the natural language of the document, the tester verifies that the tool
successfully repairs this.
- If, for markup, except where the site has documented that a specification was
violated for backward compatibility, the markup fails to satisfy one or more of the items below, the tester verifies that a successful repair is
made by the tool for this:
- passed validity tests of the language (whether it be conforming to a schema, Document Type Definition (DTD),
or other tests described in the specification)
- structural elements and attributes are used as defined in the specification
- accessibility features are used
- deprecated features are avoided
- If any custom user interface elements of the content do not conform to at least level A of
the User Agent Accessibility Guidelines 1.0, and, when the custom user interfaces cannot be made accessible,
an alternative solution is not provided that meets WCAG2.0 to the level claimed, the tester verifies that a successful
repair is made by the tool for this problem (a sketch of one such repair follows this test plan).
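NOTE (informative example): A semi-automated repair for the first item above might find img elements without a text equivalent and ask the author to supply one. The Python sketch below is a rough illustration using regular expressions; the prompt, sample markup, and repair strategy are assumptions, not a recommended implementation.

    import re

    def repair_missing_alt(markup, ask_author):
        # For each img tag with no alt attribute, ask the author for a text
        # equivalent and insert it.
        def add_alt(match):
            tag = match.group(0)
            if re.search(r'\balt\s*=', tag, re.IGNORECASE):
                return tag   # already has a text equivalent
            text = ask_author("Text equivalent for %s?" % tag)
            return tag[:-1].rstrip("/ ") + ' alt="%s">' % text
        return re.sub(r"<img\b[^>]*>", add_alt, markup, flags=re.IGNORECASE)

    sample = '<p><img src="chart.gif"> Quarterly sales</p>'
    repaired = repair_missing_alt(sample, lambda prompt: "Bar chart of quarterly sales")
    print(repaired)
    # <p><img src="chart.gif" alt="Bar chart of quarterly sales"> Quarterly sales</p>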
--------------------------------------------------------------------------------------------------
- Success Criteria 1.4:
- In any element hierarchy, the author must be able to move editing focus
from any structural element to any element immediately above, immediately
below, or at the same level in the hierarchy
- In any element hierarchy, the author must be able to select, copy, cut and
paste any element with its content
Test Plan
- Tool designer defines on form how editing focus is moved for the tool (and document structures supported?)
- For every such hierarchy and editing focus, the tester (author) verifies that focus is correctly moved to element
immediately above in hierarchy and enters results on form
- For every such hierarchy and editing focus, the tester (author) verifies that focus is correctly moved to element
immediately below in hierarchy and enters results on form
- For every such hierarchy and editing focus, the tester (author) verifies that focus is correctly moved to element
immediately to left or right in hierarchy and enters results on form
- Tool designer defines element hierarchies supported on form?
- Given such an element hierarchy, the tester (author) verifies that every element (with its content) in that
hierarchy can be selected by the tool, and enters results on form.
- Given such an element hierarchy, the tester (author) verifies that every element (with its content) in that
hierarchy can be copied by the tool, and enters results on form.
- Given such an element hierarchy, the tester (author) verifies that every element (with its content) in that
hierarchy can be cut by the tool, and enters results on form.
- Given such an element hierarchy, the tester (author) verifies that every element (with its content) in that
hierarchy can be pasted by the tool, and enters results on form (see the traversal sketch following this test plan).
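NOTE (informative example): The focus moves and the select/copy/cut/paste operations can be modelled on a simple element tree. The Python sketch below only illustrates the expected behavior; a real test drives the tool's own editing views, and the sample document is hypothetical.

    import xml.etree.ElementTree as ET

    doc = ET.fromstring(
        "<body><h1>Title</h1><div><p>First</p><p>Second</p></div></body>")
    parent_of = {child: parent for parent in doc.iter() for child in parent}

    def move_up(elem):
        return parent_of.get(elem)      # element immediately above

    def move_down(elem):
        return next(iter(elem), None)   # first element immediately below

    def move_sibling(elem, offset=1):
        parent = parent_of.get(elem)
        if parent is None:
            return None
        siblings = list(parent)
        i = siblings.index(elem) + offset
        return siblings[i] if 0 <= i < len(siblings) else None

    first_p = doc.find("./div/p")
    print(move_up(first_p).tag)        # div
    print(move_sibling(first_p).tag)   # p (the second paragraph)
    print(move_down(doc).tag)          # h1

    # Select / copy / cut / paste an element together with its content.
    div = doc.find("div")
    copied = ET.fromstring(ET.tostring(div))   # copy
    doc.remove(div)                            # cut
    doc.append(copied)                         # paste
    print(ET.tostring(doc, encoding="unicode"))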
- Success criteria 1.5:
- The authoring tool must have a search function for all editing views
- The author must be able to search for text within all text equivalents of
any rendered non-text content
- The author must be able to specify whether to search content, markup, or both
Test Plan
- The tool designer defines all editing views supported by the tool, and all search functions supported by the tool, on a form
- Given this information, the tester (author) verifies that at least one search function exists for a sample of editing views as defined above,
and enters results on form
- Given this information, the tester (author) verifies that all samples of text within every text equivalent are searched for successfully,
and enters results on form
- Given this information, the tester (author) verifies that the author is able to specify whether to search content, markup, or both,
and enters results on a form (see the sketch following this test plan).
- Given this information, the tester (author) verifies that all searching mentioned previously is performed successfully,
and enters results on a form.
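NOTE (informative example): The Python sketch below illustrates the content/markup/text-equivalent distinction in a search function; the sample document and search targets are hypothetical.

    import xml.etree.ElementTree as ET

    doc = ET.fromstring(
        '<body><p>Opening hours</p><img src="map.gif" alt="Map to the shop"/></body>')

    def search(term, where="both"):
        # where is "content", "markup", or "both"
        hits = []
        for elem in doc.iter():
            text = (elem.text or "") + " " + (elem.get("alt") or "")
            if where in ("content", "both") and term.lower() in text.lower():
                hits.append((elem.tag, "content / text equivalent"))
            if where in ("markup", "both") and term.lower() in elem.tag.lower():
                hits.append((elem.tag, "markup"))
        return hits

    print(search("map", where="content"))   # found in the alt text equivalent
    print(search("img", where="markup"))    # found in the markup only
    print(search("hours", where="both"))    # found in element content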
- Success Criteria 2.7: When unrecognized markup (e.g. external entity, unrecognized element
or attribute name) is detected, the tool must query the author for consent to modify
the markup. If the author refuses, and the markup cannot be processed, the tool must
refuse to open the markup for editing.
Test Plan
- The tool designer defines what unrecognized markup is in the context of the tool (documentation) on a form
- The tool designer defines how unrecognized markup is detected by the tool, and what actions the tool takes in those instances, on a form
- Given this information, the tester (author) verifies that a query is always successfully directed at the author in every such instance,
and enters results on a form.
- Given this information, the tester (author) verifies that each such query successfully gives the author a valid choice as to whether to modify the markup,
and enters results on form.
- Given this information, the tester (author) verifies that each time the author refuses such a query, the tool does not open the markup for editing, and
enters the results on form.
- Given this information, the tester (author) verifies that each time the author accepts such a query, the tool does open the markup for editing, and
enters the results on form (see the sketch following this test plan).
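NOTE (informative example): The required behavior reduces to a short consent rule. The Python sketch below is illustrative only; the recognized-element list, the consent prompt, and the "document" are hypothetical.

    RECOGNIZED = {"html", "head", "title", "body", "p", "img", "a"}

    def open_for_editing(elements, ask_author):
        unrecognized = [e for e in elements if e not in RECOGNIZED]
        if not unrecognized:
            return True      # nothing unusual; open normally
        # Query the author for consent to modify the unrecognized markup.
        if ask_author("Modify unrecognized markup %s?" % unrecognized):
            return True      # author consented; open for editing
        # Author refused and the markup cannot be processed: refuse to open.
        return False

    print(open_for_editing(["html", "body", "p"], lambda q: False))        # True
    print(open_for_editing(["html", "body", "blink"], lambda q: True))     # True
    print(open_for_editing(["html", "body", "blink"], lambda q: False))    # False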
- Success Criteria 3.8: All examples of markup code and views of the user interface (dialog
screenshots, etc.) must satisfy all Level 2 success criteria below of WCAG2.0 (10/27/03 draft), regardless of whether the examples
are intended to demonstrate accessibility authoring practices (NOTE: Should this be
relative priority since it refers to WCAG?):
Test Plan
- The tool designer defines all types of markup code and all views of the user interface supported by the tool on a form
- Given this information, author (tester) successfully verifies each following item (or N/A) and enters results on a form:
- The tester verifies that subject to previous info non-text content that cannot be expressed in words always has a text
equivalent for all aspects that can be expressed in words
- The tester verifies that subject to previous info a text document that merges all audio descriptions and captions
into a collated script is always provided.
- The tester verifies that subject to previous info captions and audio descriptions are always provided for all live broadcasts.
- The tester verifies that subject to previous info any information presented using color is always available without color and
without having to interpret markup.
- The tester verifies that subject to previous info all abbreviations and acronyms are clearly identified each time they occur if
they collide with a word in the standard language that would also logically appear in the same case
- The tester verifies that subject to previous info all symbols such as diacritic marks that are found in standard usage of the
natural language of the content, and that are necessary for unambiguous identification of words, are always present or another
standard mechanism for disambiguation is always provided.
- The tester verifies that subject to previous info all structural elements present have a different visual appearance or auditory
characteristic from each other and from body text.
- The tester verifies that subject to previous info all text that is presented over a background color or grayscale has a mechanism
that allows the text to be presented in a fashion that has a "large" contrast between text and background color.
- The tester verifies that subject to previous info all audio content does not contain background sounds or the background sounds
are at least 20db lower than the foreground audio content.
- The tester verifies that subject to previous info wherever a choice between event handlers is available and supported, the more
abstract event is always used.
- The tester verifies that subject to previous info that any blinking content can always be turned off.
- The tester verifies that subject to previous info that any moving content can always be paused.
- The tester verifies that subject to previous info all animation or other content does not visibly or purposely flicker between 3 and 49Hz.
- The tester verifies that subject to previous info all content that might create a problem has been tested, and only pages with unavoidable
flicker remain and appropriate warnings along with a close alternative presentation have been provided for these pages.
- The tester verifies that subject to previous info that in all documents greater than 50000 words or all sites larger than 50 perceived pages,
at least one of the following is provided: hierarchical structure markup, table of contents, and alternate display orders
- The tester verifies that subject to previous info users are always able to skip over large blocks of repetitive material, navigational bars
or other blocks of links that are greater than 7 when reading with synthesized speech or navigating using the keyboard.
- The tester verifies that subject to previous info if an error is detected, feedback is always provided to the user identifying the error.
- The tester verifies that subject to previous info all acronyms and abbreviations do not appear first in standard unabridged dictionaries
for the language, or are always defined the first time they appear, or are always available in a glossary on the site.
- The tester verifies that subject to previous info all content has been reviewed, taking into account the following strategies for
evaluating the complexity of content, applying as appropriate: familiarity of terms and language structure, reasonableness of length
and complexity of sentences, coherence of paragraphs (and sensibility in length), clarity of headings and linked text when read out of
context, accuracy and uniqueness of page titles, care in the use of all-capital letters where normal sentence case might increase
comprehension, inclusion of non-text content to supplement text for key pages or sections of the site where appropriate.
- The tester verifies that subject to previous info all key orientation and navigational elements are generally found in one or two consistent
locations or their locations are always otherwise predictable
- The tester verifies that subject to previous info where inconsistent or unpredictable responses are essential to the function of the
content, the user is always warned in advance of encountering them
- The tester verifies that subject to previous info wherever there are extreme changes in context, one of the following is always true: an easy-
to-find setting, that persists for the site visit, is provided for the user to deactivate processes or features that cause extreme changes
in context, or extreme changes in context are identified before they occur so they can be prepared for the change
- The tester verifies that subject to previous info for all markup, the markup has: passed validity tests of the language, structural elements
and attributes are used as defined in the specification, accessibility features are used, and deprecated features are avoided
- The tester verifies that subject to previous info all accessibility conventions of the markup or programming language are always used
- The tester verifies that subject to previous info all relevant interfaces have been tested using a variety of assistive technologies (and
preferably real people with disabilities) to determine that those assistive technologies are always able to access all information
on the page or hidden within the page
- The tester verifies that subject to previous info all applicable Web resources include a list of the technologies users must have in order for
its content to work as intended
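(NOTE: A possible automation aid for the text-equivalent item in the list above — a minimal Python sketch, assuming the documentation examples are HTML files; the file names, the "alt attribute" rule, and the PASS/FAIL wording are illustrative assumptions, not requirements of this plan.)

    import sys
    from html.parser import HTMLParser

    class AltTextCheck(HTMLParser):
        """Collects <img> elements that lack a text equivalent (alt attribute)."""
        def __init__(self):
            super().__init__()
            self.missing = []   # (line, column) of each offending <img>

        def handle_starttag(self, tag, attrs):
            if tag == "img" and "alt" not in dict(attrs):
                self.missing.append(self.getpos())

    def check_file(path):
        checker = AltTextCheck()
        with open(path, encoding="utf-8") as f:
            checker.feed(f.read())
        for line, col in checker.missing:
            print(f"{path}:{line}:{col}: <img> without a text equivalent")
        return not checker.missing

    if __name__ == "__main__":
        # e.g.  python alt_check.py example1.html example2.html
        results = [check_file(p) for p in sys.argv[1:]]
        print("PASS" if all(results) else "FAIL")

The tester would still record the result on the form; the script only flags candidates for human review.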
- Success Criteria 4.1:
- When a tool provides a means for markup to be added with a single mouse click or keystroke,
that markup must satisfy all of the Level 2 success criteria of WCAG2.0 (10/27/03 draft) unless the markup was authored "by hand".
- When an authoring action has several markup implementations (e.g., changing the color of text
with presentation markup or style sheets), those markup implementations that satisfy all of the Level 2
success criteria of WCAG2.0 (10/27/03 draft) must be equal
to or higher on all of the following scales than those
markup implementations that do not meet the above WCAG2.0 requirements:
- prominence of location (in "power tools" such as floating menus, toolbars, etc.)
- position in layout (top to bottom and left to right in menus, dialog boxes, etc.)
- size of control (measured as screen area)
- actions to activate (number of mouse clicks or keystrokes)
(NOTE: Should this be relative priority?)
Test Plan:
- The tool designer defines all ways supported by the tool of adding markup with a single mouse click or keystroke,
and enters results on form
- Given this information (for each way and all markup examples), author (tester) verifies each of the following items (or enters N/A as appropriate) and enters results on a form:
- The tester verifies that subject to previous info non-text content that cannot be expressed in words always has a text
equivalent for all aspects that can be expressed in words
- The tester verifies that subject to previous info a text document that merges all audio descriptions and captions
into a collated script is always provided.
- The tester verifies that subject to previous info captions and audio descriptions are always provided for all live broadcasts.
- The tester verifies that subject to previous info any information presented using color is always available without color and
without having to interpret markup.
- The tester verifies that subject to previous info all abbreviations and acronyms are clearly identified each time they occur if
they collide with a word in the standard language that would also logically appear in the same case
- The tester verifies that subject to previous info all symbols such as diacritic marks that are found in standard usage of the
natural language of the content, and that are necessary for unambiguous identification of words, are always present or another
standard mechanism for disambiguation is always provided.
- The tester verifies that subject to previous info all structural elements present have a different visual appearance or auditory
characteristic from each other and from body text.
- The tester verifies that subject to previous info all text that is presented over a background color or grayscale has a mechanism
that allows the text to be presented in a fashion that has a "large" contrast between text and background color.
- The tester verifies that subject to previous info all audio content does not contain background sounds or the background sounds
are at least 20 dB lower than the foreground audio content.
- The tester verifies that subject to previous info wherever a choice between event handlers is available and supported, the more
abstract event is always used.
- The tester verifies that subject to previous info any blinking content can always be turned off.
- The tester verifies that subject to previous info any moving content can always be paused.
- The tester verifies that subject to previous info all animation or other content does not visibly or purposely flicker between 3 and 49 Hz.
- The tester verifies that subject to previous info all content that might create a problem has been tested, and only pages with unavoidable
flicker remain and appropriate warnings along with a close alternative presentation have been provided for these pages.
- The tester verifies that subject to previous info in all documents greater than 50,000 words or all sites larger than 50 perceived pages,
at least one of the following is provided: hierarchical structure markup, table of contents, and alternate display orders
- The tester verifies that subject to previous info users are always able to skip over large blocks of repetitive material, navigational bars
or other blocks of links that are greater than 7 when reading with a synthesized voice or navigating using the keyboard.
- The tester verifies that subject to previous info if an error is detected, feedback is always provided to the user identifying the error.
- The tester verifies that subject to previous info all acronyms and abbreviations do not appear first in standard unabridged dictionaries
for the language, or are always defined the first time they appear, or are always available in a glossary on the site.
- The tester verifies that subject to previous info all content has been reviewed, taking into account the following strategies for
evaluating the complexity of content, applying as appropriate: familiarity of terms and language structure, reasonableness of length
and complexity of sentences, coherence of paragraphs (and sensibility in length), clarity of headings and linked text when read out of
context, accuracy and uniqueness of page titles, care in the use of all-capital letters where normal sentence case might increase
comprehension, inclusion of non-text content to supplement text for key pages or sections of the site where appropriate.
- The tester verifies that subject to previous info all key orientation and navigational elements are generally found in one or two consistent
locations or their locations are always otherwise predictable
- The tester verifies that subject to previous info where inconsistent or unpredictable responses are essential to the function of the
content, the user is always warned in advance of encountering them
- The tester verifies that subject to previous info wherever there are extreme changes in context, one of the following is always true: an easy-
to-find setting, that persists for the site visit, is provided for the user to deactivate processes or features that cause extreme changes
in context, or extreme changes in context are identified before they occur so they can be prepared for the change
- The tester verifies that subject to previous info for all markup, the markup has: passed validity tests of the language, structural elements
and attributes are used as defined in the specification, accessibility features are used, and deprecated features are avoided
- The tester verifies that subject to previous info all accessibility conventions of the markup or programming language are always used
- The tester verifies that subject to previous info all relevant interfaces have been tested using a variety of assistive technologies (and
preferably real people with disabilities) to determine that those assistive technologies are always able to access all information
on the page or hidden within the page
- The tester verifies that subject to previous info all applicable Web resources include a list of the technologies users must have in order for
its content to work as intended
- The tool designer defines all markup implementations supported by the tool on a form
- The author identifies authoring actions supporting WCAG-compliant markup implementations of the tool
- The author identifies authoring actions supporting non-WCAG-compliant markup implementations of the tool (see the NOTE and comparison sketch following this list)
- The tester verifies that every markup implementation on the WCAG list is "better" than every implementation
on the non-WCAG list in terms of prominence of location
- The tester verifies that every markup implementation on the WCAG list is "better" than every implementation
on the non-WCAG list in terms of position in layout
- The tester verifies that every markup implementation on the WCAG list is "better" than every implementation
on the non-WCAG list in terms of size of control
- The tester verifies that every markup implementation on the WCAG list is "better" than every implementation
on the non-WCAG list in terms of actions to activate
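(NOTE: One possible way to record the "better than" comparisons on the four scales above — a hedged Python sketch; the scale encodings, scores, and example implementation names are assumptions made for illustration, not values mandated by this plan.)

    from dataclasses import dataclass

    @dataclass
    class MarkupImplementation:
        name: str
        prominence: int     # higher = more prominent location (toolbar, floating menu, ...)
        layout_rank: int    # higher = earlier in layout (top/left of menus, dialog boxes)
        control_area: int   # size of the control, e.g. in square pixels
        actions: int        # mouse clicks or keystrokes needed to activate (lower is better)

    def at_least_as_good(wcag_impl, other):
        """True if the WCAG-compliant implementation is equal or better on every scale."""
        return (wcag_impl.prominence >= other.prominence
                and wcag_impl.layout_rank >= other.layout_rank
                and wcag_impl.control_area >= other.control_area
                and wcag_impl.actions <= other.actions)

    # Hypothetical entries the tester might record on the form:
    style_sheet_color = MarkupImplementation("style sheet color change", 3, 2, 400, 1)
    font_tag_color    = MarkupImplementation("presentation markup color change", 2, 1, 400, 2)
    print("PASS" if at_least_as_good(style_sheet_color, font_tag_color) else "FAIL")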
- Success Criteria 4.X: TBD
- Success Criteria 4.4: The mechanisms for accessibility prompting, checking, repair and
documentation must be similar to comparable mechanisms in terms of the following
characteristics:
- visual design (design metaphors, artistic sophistication, sizes, fonts, colors)
- operation (degree of automation, number of actions for activation)
- configurability (number and types of features)
(NOTE: This may be in AAA below?)
Test Plan
- Tool designer defines all mechanisms for accessibility prompting, checking, repair and documentation
supported by the tool on a form
- Tool designer defines all comparable mechanisms to the above supported by the tool on a form, and why they are
comparable
- Given this info, the tester (author) verifies that all mechanisms in the accessibility list have the same visual design as
all items in the comparable mechanism list, and enters the results on a form
- Given this info, the tester (author) verifies that all mechanisms in the top list above have the same operation as
all items in the comparable mechanism list, and enters the results on a form
- Given this info, the tester (author) verifies that all mechanisms in the top list above have the same configurability as
all items in the comparable mechanism list, and enters the results on a form
---------------------
- Success Criteria 2.5: All markup strings written automatically by the tool (i.e.,
not authored "by hand") must satisfy all of the Level 2 success criteria (WCAG2.0 10/27/03 draft). (RELATIVE PRIORITY):
Test Plan:
- The tool designer defines on a form how markup strings are automatically generated using the tool
- Using this info, the author (tester) verifies each of the following items (or enters N/A as appropriate) on a form (see the NOTE and validation sketch following this list):
- The tester verifies that in such examples non-text content that cannot be expressed in words always has a text
equivalent for all aspects that can be expressed in words
- The tester verifies that in such examples a text document that merges all audio descriptions and captions
into a collated script is always provided.
- The tester verifies that in such examples captions and audio descriptions are always provided for all live broadcasts.
- The tester verifies that in such examples any information presented using color is always available without color and
without having to interpret markup.
- The tester verifies that in such examples all abbreviations and acronyms are clearly identified each time they occur if
they collide with a word in the standard language that would also logically appear in the same case
- The tester verifies that in such examples all symbols such as diacritic marks that are found in standard usage of the
natural language of the content, and that are necessary for unambiguous identification of words, are always present or another
standard mechanism for disambiguation is always provided.
- The tester verifies that in such examples all structural elements present have a different visual appearance or auditory
characteristic from each other and from body text.
- The tester verifies that in such examples all text that is presented over a background color or grayscale has a mechanism
that allows the text to be presented in a fashion that has a "large" contrast between text and background color.
- The tester verifies that in such examples all audio content does not contain background sounds or the background sounds
are at least 20 dB lower than the foreground audio content.
- The tester verifies that in such examples wherever a choice between event handlers is available and supported, the more
abstract event is always used.
- The tester verifies that in such examples any blinking content can always be turned off.
- The tester verifies that in such examples any moving content can always be paused.
- The tester verifies that in such examples all animation or other content does not visibly or purposely flicker between 3 and 49 Hz.
- The tester verifies that in such examples all content that might create a problem has been tested, and only pages with unavoidable
flicker remain and appropriate warnings along with a close alternative presentation have been provided for these pages.
- The tester verifies that in such examples in all documents greater than 50,000 words or all sites larger than 50 perceived pages,
at least one of the following is provided: hierarchical structure markup, table of contents, and alternate display orders
- The tester verifies that in such examples users are always able to skip over large blocks of repetitive material, navigational bars
or other blocks of links that are greater than 7 when reading with a synthesized voice or navigating using the keyboard.
- The tester verifies that in such examples if an error is detected, feedback is always provided to the user identifying the error.
- The tester verifies that in such examples all acronyms and abbreviations do not appear first in standard unabridged dictionaries
for the language, or are always defined the first time they appear, or are always available in a glossary on the site.
- The tester verifies that in such examples all content has been reviewed, taking into account the following strategies for
evaluating the complexity of content, applying as appropriate: familiarity of terms and language structure, reasonableness of length
and complexity of sentences, coherence of paragraphs (and sensibility in length), clarity of headings and linked text when read out of
context, accuracy and uniqueness of page titles, care in the use of all-capital letters where normal sentence case might increase
comprehension, inclusion of non-text content to supplement text for key pages or sections of the site where appropriate.
- The tester verifies that in such examples all key orientation and navigational elements are generally found in one or two consistent
locations or their locations are always otherwise predictable
- The tester verifies that in such examples where inconsistent or unpredictable responses are essential to the function of the
content, the user is always warned in advance of encountering them
- The tester verifies that in such examples wherever there are extreme changes in context, one of the following is always true: an easy-
to-find setting, that persists for the site visit, is provided for the user to deactivate processes or features that cause extreme changes
in context, or extreme changes in context are identified before they occur so they can be prepared for the change
- The tester verifies that in such examples for all markup, the markup has: passed validity tests of the language, structural elements
and attributes are used as defined in the specification, accessibility features are used, and deprecated features are avoided
- The tester verifies that in such examples all accessibility conventions of the markup or programming language are always used
- The tester verifies that in such examples all relevant interfaces have been tested using a variety of assistive technologies (and
preferably real people with disabilities) to determine that those assistive technologies are always able to access all information
on the page or hidden within the page
- The tester verifies that in such examples all applicable Web resources include a list of the technologies users must have in order for
its content to work as intended
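(NOTE: A hedged sketch of how a tester might batch-check tool-generated markup strings for well-formedness before working through the list above — it assumes the strings are XML-based (e.g., XHTML) fragments; the wrapper element and sample strings are illustrative only, and this pre-check does not replace full validation against the language specification.)

    import xml.etree.ElementTree as ET

    def well_formed(fragment):
        """True if the generated markup string parses as well-formed XML (e.g., XHTML)."""
        try:
            # Wrap the fragment so multiple sibling elements still parse.
            ET.fromstring(f"<wrapper>{fragment}</wrapper>")
            return True
        except ET.ParseError as err:
            print(f"not well-formed: {err}")
            return False

    # Hypothetical strings captured from the tool under author control:
    samples = ['<img src="logo.png" alt="Company logo"/>',
               '<em>emphasis<em>']          # deliberately broken sample
    for s in samples:
        print(s, "->", "PASS" if well_formed(s) else "FAIL")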
- Success Criteria 2.6: Any web content (e.g., templates, clip art, multimedia objects,
scripts, applets, example pages, etc.) preferentially licensed (i.e., better terms
for users of the tool than for others) for users of the tool, must satisfy all of the
Level 2 success criteria of WCAG2.0 (10/27/03 draft)(RELATIVE PRIORITY):
Test Plan:
- The tool designer defines all examples of preferentially-licensed content supported by the tool on a form
- The tool designer defines what preferential licensing means in the context of the tool, on a form
- Using this info, the author (tester) verifies each of the following items (or enters N/A as appropriate) on a form:
- The tester verifies that in such examples non-text content that cannot be expressed in words always has a text
equivalent for all aspects that can be expressed in words
- The tester verifies that in such examples a text document that merges all audio descriptions and captions
into a collated script is always provided.
- The tester verifies that in such examples captions and audio descriptions are always provided for all live broadcasts.
- The tester verifies that in such examples any information presented using color is always available without color and
without having to interpret markup.
- The tester verifies that in such examples all abbreviations and acronyms are clearly identified each time they occur if
they collide with a word in the standard language that would also logically appear in the same case
- The tester verifies that in such examples all symbols such as diacritic marks that are found in standard usage of the
natural language of the content, and that are necessary for unambiguous identification of words, are always present or another
standard mechanism for disambiguation is always provided.
- The tester verifies that in such examples all structural elements present have a different visual appearance or auditory
characteristic from each other and from body text.
- The tester verifies that in such examples all text that is presented over a background color or grayscale has a mechanism
that allows the text to be presented in a fashion that has a "large" contrast between text and background color.
- The tester verifies that in such examples all audio content does not contain background sounds or the background sounds
are at least 20 dB lower than the foreground audio content.
- The tester verifies that in such examples wherever a choice between event handlers is available and supported, the more
abstract event is always used.
- The tester verifies that in such examples any blinking content can always be turned off.
- The tester verifies that in such examples any moving content can always be paused.
- The tester verifies that in such examples all animation or other content does not visibly or purposely flicker between 3 and 49 Hz.
- The tester verifies that in such examples all content that might create a problem has been tested, and only pages with unavoidable
flicker remain and appropriate warnings along with a close alternative presentation have been provided for these pages.
- The tester verifies that in such examples in all documents greater than 50,000 words or all sites larger than 50 perceived pages,
at least one of the following is provided: hierarchical structure markup, table of contents, and alternate display orders
- The tester verifies that in such examples users are always able to skip over large blocks of repetitive material, navigational bars
or other blocks of links that are greater than 7 when reading with a synthesized voice or navigating using the keyboard.
- The tester verifies that in such examples if an error is detected, feedback is always provided to the user identifying the error.
- The tester verifies that in such examples all acronyms and abbreviations do not appear first in standard unabridged dictionaries
for the language, or are always defined the first time they appear, or are always available in a glossary on the site.
- The tester verifies that in such examples all content has been reviewed, taking into account the following strategies for
evaluating the complexity of content, applying as appropriate: familiarity of terms and language structure, reasonableness of length
and complexity of sentences, coherence of paragraphs (and sensibility in length), clarity of headings and linked text when read out of
context, accuracy and uniqueness of page titles, care in the use of all-capital letters where normal sentence case might increase
comprehension, inclusion of non-text content to supplement text for key pages or sections of the site where appropriate.
- The tester verifies that in such examples all key orientation and navigational elements are generally found in one or two consistent
locations or their locations are always otherwise predictable
- The tester verifies that in such examples where inconsistent or unpredictable responses are essential to the function of the
content, the user is always warned in advance of encountering them
- The tester verifies that in such examples wherever there are extreme changes in context, one of the following is always true: an easy-
to-find setting, that persists for the site visit, is provided for the user to deactivate processes or features that cause extreme changes
in context, or extreme changes in context are identified before they occur so they can be prepared for the change
- The tester verifies that in such examples for all markup, the markup has: passed validity tests of the language, structural elements
and attributes are used as defined in the specification, accessibility features are used, and deprecated features are avoided
- The tester verifies that in such examples all accessibility conventions of the markup or programming language are always used
- The tester verifies that in such examples all relevant interfaces have been tested using a variety of assistive technologies (and
preferably real people with disabilities) to determine that those assistive technologies are always able to access all information
on the page or hidden within the page
- The tester verifies that in such examples all applicable Web resources include a list of the technologies users must have in order for
its content to work as intended
- Success Criteria 3.1:
- When the actions of the author risk creating accessibility problems according to any of the
Level 2 success criteria of WCAG2.0 (10/27/03 draft),
the tool must intervene to introduce the appropriate accessible
authoring practice. This intervention may proceed according to a user-configurable schedule.
- The intervention must occur at least once before completion of authoring (e.g., final save,
publishing, etc.)(RELATIVE PRIORITY)
Test Plan:
- The tool designer defines all author-initiated actions supported by the tool on a form
- The tool designer describes how accessibility problems are detected using the tool on a form
- The tool designer describes how the tool intervenes given the preceding information, on a form
- Given this info, the author (tester) verifies each of the following items (or enters N/A as appropriate) on a form (see the NOTE and intervention sketch following this list):
- The tester verifies that if as a result of such actions non-text content that cannot be expressed in words does not always have a text
equivalent for all aspects that can be expressed in words, the tool always produces an acceptable alternative
- The tester verifies that if as a result of such actions a text document that merges all audio descriptions and captions
into a collated script is not always provided, the tool always produces an acceptable alternative.
- The tester verifies that if as a result of such actions captions and audio descriptions are not always provided for all live broadcasts,
the tool always produces an acceptable alternative.
- The tester verifies that if as a result of such actions any information presented using color is not always available without color and
without having to interpret markup, the tool always produces an acceptable alternative.
- The tester verifies that if as a result of such actions all abbreviations and acronyms are not clearly identified each time they occur if
they collide with a word in the standard language that would also logically appear in the same case, the tool always produces an acceptable alternative
- The tester verifies that if as a result of such actions all symbols such as diacritic marks that are found in standard usage of the
natural language of the content, and that are necessary for unambiguous identification of words, are not always present or another
standard mechanism for disambiguation is not always provided, the tool always produces an acceptable alternative.
- The tester verifies that if as a result of such actions all structural elements present do not have a different visual appearance or auditory
characteristic from each other and from body text, the tool always produces an acceptable alternative.
- The tester verifies that if as a result of such actions all text that is presented over a background color or grayscale does not have a mechanism
that allows the text to be presented in a fashion that has a "large" contrast between text and background color, the tool always produces an acceptable alternative
- The tester verifies that if as a result of such actions all audio content contains background sounds or the background sounds
are not at least 20 dB lower than the foreground audio content, the tool always produces an acceptable alternative.
- The tester verifies that if as a result of such actions wherever a choice between event handlers is available and supported, the more
abstract event is not always used, the tool always produces an acceptable alternative.
- The tester verifies that if as a result of such actions any blinking content can not always be turned off, the tool always produces an acceptable alternative.
- The tester verifies that if as a result of such actions any moving content can not always be paused, the tool always produces an acceptable alternative.
- The tester verifies that if as a result of such actions all animation or other content visibly or purposely flickers between 3 and 49 Hz, the tool always produces an acceptable alternative.
- The tester verifies that if as a result of such actions all content that might create a problem has not been tested, and only pages with unavoidable
flicker remain and appropriate warnings along with a close alternative presentation have not been provided for these pages, the tool always produces an acceptable alternative.
- The tester verifies that if as a result of such actions in all documents greater than 50,000 words or all sites larger than 50 perceived pages,
none of the following is provided: hierarchical structure markup, table of contents, and alternate display orders, the tool always produces an acceptable alternative
- The tester verifies that if as a result of such actions users are not always able to skip over large blocks of repetitive material, navigational bars
or other blocks of links that are greater than 7 when reading with a synthesized voice or navigating using the keyboard, the tool always produces an acceptable alternative.
- The tester verifies that if as a result of such actions an error is detected and feedback is not always provided to the user identifying the error, the tool always produces an acceptable alternative.
- The tester verifies that if as a result of such actions all acronyms and abbreviations appear first in standard unabridged dictionaries
for the language, or are not always defined the first time they appear, or are not always available in a glossary on the site, the tool always produces an acceptable alternative.
- The tester verifies that if as a result of such actions all content has not been reviewed, taking into account the following strategies for
evaluating the complexity of content, applying as appropriate: familiarity of terms and language structure, reasonableness of length
and complexity of sentences, coherence of paragraphs (and sensibility in length), clarity of headings and linked text when read out of
context, accuracy and uniqueness of page titles, care in the use of all-capital letters where normal sentence case might increase
comprehension, inclusion of non-text content to supplement text for key pages or sections of the site where appropriate, the tool always produces an acceptable alternative.
- The tester verifies that if as a result of such actions all key orientation and navigational elements are not generally found in one or two consistent
locations or their locations are not otherwise predictable, the tool always produces an acceptable alternative
- The tester verifies that if as a result of such actions where inconsistent or unpredictable responses are essential to the function of the
content, the user is not always warned in advance of encountering them, the tool always produces an acceptable alternative
- The tester verifies that if as a result of such actions wherever there are extreme changes in context, none of the following are true: an easy-
to-find setting, that persists for the site visit, is provided for the user to deactivate processes or features that cause extreme changes
in context, or extreme changes in context are identified before they occur so they can be prepared for the change, the tool always produces an acceptable alternative
- The tester verifies that if as a result of such actions for all markup, the markup has not: passed validity tests of the language, structural elements
and attributes are used as defined in the specification, accessibility features are used, and deprecated features are avoided, the tool always produces an acceptable alternative
- The tester verifies that if as a result of such actions all accessibility conventions of the markup or programming language are not always used, the tool always produces an acceptable alternative
- The tester verifies that if as a result of such actions all relevant interfaces have not been tested using a variety of assistive technologies (and
preferably real people with disabilities) to determine that those assistive technologies are always able to access all information
on the page or hidden within the page, the tool always produces an acceptable alternative
- The tester verifies that if as a result of such actions all applicable Web resources do not include a list of the technologies users must have in order for
its content to work as intended, the tool always produces an acceptable alternative
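(NOTE: A hedged sketch of the kind of intervention this test plan exercises — a hook the tool could call at least once before final save or publishing. The document dictionary, the single "missing text equivalent" rule, and the prompt wording are assumptions made for illustration; they are not an implementation prescribed by this guideline.)

    def pending_problems(document):
        """Return accessibility problems introduced by the author's actions.
        Only one illustrative rule is shown: images lacking text equivalents."""
        return [img for img in document.get("images", []) if not img.get("alt")]

    def intervene_before_completion(document, ask):
        """Called at least once before final save/publish; `ask` is whatever prompting
        mechanism the tool provides (dialog, sidebar, task pane, ...)."""
        for img in pending_problems(document):
            answer = ask(f"The image '{img['src']}' has no text equivalent. Please provide one:")
            if answer:
                img["alt"] = answer   # the accessible authoring practice is introduced

    # Hypothetical test run (a real tool would prompt the author interactively):
    doc = {"images": [{"src": "chart.png", "alt": ""}]}
    intervene_before_completion(doc, ask=lambda prompt: "Bar chart of sales by quarter")
    print(doc["images"][0]["alt"])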
- Success Criteria 3.2: The tool must provide a check (automated check, semi-automated check or
manual check) for detecting violations of each Level 2 success criterion of WCAG2 (10/27/03 draft)(RELATIVE PRIORITY)
Test Plan:
- The tool designer defines how checks (for detecting accessibility violations) are provided by the tool on a form
- The tool designer describes each check (for detecting accessibility violations), and what kind it is, on a form
- Given this info, the author (tester) verifies each of the following items (or enters N/A as appropriate) on a form (see the NOTE and contrast-check sketch following this list):
- The tester verifies that if non-text content that cannot be expressed in words does not always have a text
equivalent for all aspects that can be expressed in words, the tool always produces an acceptable check
- The tester verifies that if a text document that merges all audio descriptions and captions
into a collated script is not always provided, the tool always produces an acceptable check.
- The tester verifies that if captions and audio descriptions are not always provided for all live broadcasts,
the tool always produces an acceptable check.
- The tester verifies that if any information presented using color is not always available without color and
without having to interpret markup, the tool always produces an acceptable check.
- The tester verifies that if all abbreviations and acronyms are not clearly identified each time they occur if
they collide with a word in the standard language that would also logically appear in the same case, the tool always produces an acceptable check
- The tester verifies that if all symbols such as diacritic marks that are found in standard usage of the
natural language of the content, and that are necessary for unambiguous identification of words, are not always present or another
standard mechanism for disambiguation is not always provided, the tool always produces an acceptable check.
- The tester verifies that if all structural elements present do not have a different visual appearance or auditory
characteristic from each other and from body text, the tool always produces an acceptable check.
- The tester verifies that if all text that is presented over a background color or grayscale does not have a mechanism
that allows the text to be presented in a fashion that has a "large" contrast between text and background color, the tool always produces an acceptable check
- The tester verifies that if all audio content contains background sounds or the background sounds
are not at least 20 dB lower than the foreground audio content, the tool always produces an acceptable check.
- The tester verifies that if wherever a choice between event handlers is available and supported, the more
abstract event is not always used, the tool always produces an acceptable check.
- The tester verifies that if any blinking content can not always be turned off, the tool always produces an acceptable check.
- The tester verifies that if any moving content can not always be paused, the tool always produces an acceptable check.
- The tester verifies that if all animation or other content visibly or purposely flickers between 3 and 49 Hz, the tool always produces an acceptable check.
- The tester verifies that if all content that might create a problem has not been tested, and only pages with unavoidable
flicker remain and appropriate warnings along with a close alternative presentation have not been provided for these pages, the tool always produces an acceptable check.
- The tester verifies that if in all documents greater than 50,000 words or all sites larger than 50 perceived pages,
none of the following is provided: hierarchical structure markup, table of contents, and alternate display orders, the tool always produces an acceptable check
- The tester verifies that if users are not always able to skip over large blocks of repetitive material, navigational bars
or other blocks of links that are greater than 7 when reading with a synthesized voice or navigating using the keyboard, the tool always produces an acceptable check.
- The tester verifies that if an error is detected and feedback is not always provided to the user identifying the error, the tool always produces an acceptable check.
- The tester verifies that if all acronyms and abbreviations appear first in standard unabridged dictionaries
for the language, or are not always defined the first time they appear, or are not always available in a glossary on the site, the tool always produces an acceptable check.
- The tester verifies that if all content has not been reviewed, taking into account the following strategies for
evaluating the complexity of content, applying as appropriate: familiarity of terms and language structure, reasonableness of length
and complexity of sentences, coherence of paragraphs (and sensibility in length), clarity of headings and linked text when read out of
context, accuracy and uniqueness of page titles, care in the use of all-capital letters where normal sentence case might increase
comprehension, inclusion of non-text content to supplement text for key pages or sections of the site where appropriate, the tool always produces an acceptable check
- The tester verifies that if all key orientation and navigational elements are not generally found in one or two consistent
locations or their locations are not otherwise predictable, the tool always produces an acceptable check
- The tester verifies that if where inconsistent or unpredictable responses are essential to the function of the
content and the user is not always warned in advance of encountering them, the tool always produces an acceptable check
- The tester verifies that if wherever there are extreme changes in context, none of the following are true: an easy-
to-find setting, that persists for the site visit, is provided for the user to deactivate processes or features that cause extreme changes
in context, or extreme changes in context are identified before they occur so they can be prepared for the change, the tool always produces an acceptable check
- The tester verifies that if for all markup, the markup has not: passed validity tests of the language, structural elements
and attributes are used as defined in the specification, accessibility features are used, and deprecated features are avoided, the tool always produces an acceptable check
- The tester verifies that if all accessibility conventions of the markup or programming language are not always used, the tool always produces an acceptable check
- The tester verifies that if all relevant interfaces have not been tested using a variety of assistive technologies (and
preferably real people with disabilities) to determine that those assistive technologies are always able to access all information
on the page or hidden within the page, the tool always produces an acceptable check
- The tester verifies that if all applicable Web resources do not include a list of the technologies users must have in order for
its content to work as intended, the tool always produces an acceptable check
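(NOTE: A hedged sketch of an automated check for the text-contrast item above, using the WCAG relative-luminance and contrast-ratio formulas. Because this document does not define what a "large" contrast is, the 4.5:1 threshold below is an assumption for illustration only.)

    def relative_luminance(rgb):
        """WCAG relative luminance of an sRGB color given as (r, g, b) in 0-255."""
        def channel(c):
            c = c / 255.0
            return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
        r, g, b = (channel(c) for c in rgb)
        return 0.2126 * r + 0.7152 * g + 0.0722 * b

    def contrast_ratio(fg, bg):
        """Contrast ratio between foreground and background colors (1:1 to 21:1)."""
        lighter, darker = sorted((relative_luminance(fg), relative_luminance(bg)), reverse=True)
        return (lighter + 0.05) / (darker + 0.05)

    # The tester records whether the tool's check flags text whose contrast is not "large".
    print(round(contrast_ratio((0, 0, 0), (255, 255, 255)), 1))     # 21.0 for black on white
    print("PASS" if contrast_ratio((102, 102, 102), (255, 255, 255)) >= 4.5 else "FAIL")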
- Success Criteria 3.3: The tool must provide a repair (automated repair, semi-automated repair or
manual repair) for correcting violations of each Level 2 success criterion of WCAG2.0 (10/28/03 draft)(RELATIVE PRIORITY):
Test Plan:
- The tool designer defines how repairs (for correcting accessibility violations) are provided by the tool on a form
- The tool designer describes each repair (for correcting accessibility violations), and what kind it is, on a form
- Given this info, the author (tester) verifies each of the following items (or enters N/A as appropriate) on a form (see the NOTE and repair sketch following this list):
- The tester verifies that if non-text content that cannot be expressed in words does not always have a text
equivalent for all aspects that can be expressed in words, the tool always produces an acceptable repair
- The tester verifies that if a text document that merges all audio descriptions and captions
into a collated script is not always provided, the tool always produces an acceptable repair.
- The tester verifies that if captions and audio descriptions are not always provided for all live broadcasts,
the tool always produces an acceptable repair.
- The tester verifies that if any information presented using color is not always available without color and
without having to interpret markup, the tool always produces an acceptable repair.
- The tester verifies that if all abbreviations and acronyms are not clearly identified each time they occur if
they collide with a word in the standard language that would also logically appear in the same case, the tool always produces an acceptable repair
- The tester verifies that if all symbols such as diacritic marks that are found in standard usage of the
natural language of the content, and that are necessary for unambiguous identification of words, are not always present or another
standard mechanism for disambiguation is not always provided, the tool always produces an acceptable repair.
- The tester verifies that if all structural elements present do not have a different visual appearance or auditory
characteristic from each other and from body text, the tool always produces an acceptable repair.
- The tester verifies that if all text that is presented over a background color or grayscale does not have a mechanism
that allows the text to be presented in a fashion that has a "large" contrast between text and background color, the tool always produces an acceptable repair
- The tester verifies that if all audio content contains background sounds or the background sounds
are not at least 20 dB lower than the foreground audio content, the tool always produces an acceptable repair.
- The tester verifies that if wherever a choice between event handlers is available and supported, the more
abstract event is not always used, the tool always produces an acceptable repair.
- The tester verifies that if any blinking content can not always be turned off, the tool always produces an acceptable repair.
- The tester verifies that if any moving content can not always be paused, the tool always produces an acceptable repair.
- The tester verifies that if all animation or other content visibly or purposely flickers between 3 and 49 Hz, the tool always produces an acceptable repair.
- The tester verifies that if all content that might create a problem has not been tested, and only pages with unavoidable
flicker remain and appropriate warnings along with a close alternative presentation have not been provided for these pages, the tool always produces an acceptable repair.
- The tester verifies that if in all documents greater than 50,000 words or all sites larger than 50 perceived pages,
none of the following is provided: hierarchical structure markup, table of contents, and alternate display orders, the tool always produces an acceptable repair
- The tester verifies that if users are not always able to skip over large blocks of repetitive material, navigational bars
or other blocks of links that are greater than 7 when reading with a synthesized voice or navigating using the keyboard, the tool always produces an acceptable repair.
- The tester verifies that if an error is detected and feedback is not always provided to the user identifying the error, the tool always produces an acceptable repair.
- The tester verifies that if all acronyms and abbreviations appear first in standard unabridged dictionaries
for the language, or are not always defined the first time they appear, or are not always available in a glossary on the site, the tool always produces an acceptable repair.
- The tester verifies that if all content has not been reviewed, taking into account the following strategies for
evaluating the complexity of content, applying as appropriate: familiarity of terms and language structure, reasonableness of length
and complexity of sentences, coherence of paragraphs (and sensibility in length), clarity of headings and linked text when read out of
context, accuracy and uniqueness of page titles, care in the use of all-capital letters where normal sentence case might increase
comprehension, inclusion of non-text content to supplement text for key pages or sections of the site where appropriate, the tool always produces an acceptable repair
- The tester verifies that if all key orientation and navigational elements are not generally found in one or two consistent
locations or their locations are not otherwise predictable, the tool always produces an acceptable repair
- The tester verifies that if where inconsistent or unpredictable responses are essential to the function of the
content and the user is not always warned in advance of encountering them, the tool always produces an acceptable repair
- The tester verifies that if wherever there are extreme changes in context, none of the following are true: an easy-
to-find setting, that persists for the site visit, is provided for the user to deactivate processes or features that cause extreme changes
in context, or extreme changes in context are identified before they occur so they can be prepared for the change, the tool always produces an acceptable repair
- The tester verifies that if for all markup, the markup has not: passed validity tests of the language, structural elements
and attributes are used as defined in the specification, accessibility features are used, and deprecated features are avoided, the tool always produces an acceptable repair
- The tester verifies that if all accessibility conventions of the markup or programming language are not always used, the tool always produces an acceptable repair
- The tester verifies that if all relevant interfaces have not been tested using a variety of assistive technologies (and
preferably real people with disabilities) to determine that those assistive technologies are always able to access all information
on the page or hidden within the page, the tool always produces an acceptable repair
- The tester verifies that if all applicable Web resources do not include a list of the technologies users must have in order for
its content to work as intended, the tool always produces an acceptable repair
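(NOTE: A hedged sketch of a semi-automated repair for one violation type — a missing text equivalent — in which a suggested fix is applied only after the author confirms it. The document structure, the suggestion heuristic, and the confirmation callback are assumptions for illustration; they are not the repair mechanism required by this criterion.)

    def propose_repairs(document):
        """Yield (image, suggested text equivalent) pairs for one illustrative violation type.
        Other Level 2 criteria would need their own repair rules."""
        for img in document.get("images", []):
            if not img.get("alt"):
                suggestion = img.get("title") or img["src"].rsplit("/", 1)[-1].rsplit(".", 1)[0]
                yield img, suggestion

    def semi_automated_repair(document, confirm):
        """Apply each suggested repair only if the author confirms it (semi-automated repair)."""
        for img, suggestion in propose_repairs(document):
            if confirm(f'Use "{suggestion}" as the text equivalent for {img["src"]}?'):
                img["alt"] = suggestion

    # Hypothetical test run: the tester approves every suggestion and inspects the result.
    doc = {"images": [{"src": "figures/site-map.png", "alt": ""}]}
    semi_automated_repair(doc, confirm=lambda prompt: True)
    print(doc["images"][0]["alt"])   # "site-map"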
----------------------------------------------------------------------------------------------------
Conformance Level AAA (after doing A and AA)
- Success Criteria 1.1: The authoring interface must pass recommended elements of the Software
Accessibility Guidelines testing criteria
Test Plan
- The tool designer describes the authoring interface(s) supported by the tool on a form
- The author (tester) describes which software accessibility guidelines are being used for verification, on a form
- The author (tester) describes which elements of these guidelines are recommended, on a form
- Given the above, for each authoring interface, tester verifies that all elements of the interface specified above pass each recommended element
of the applicable Software Accessibility Guidelines testing criteria, and enters the results on a form
- Success Criteria 3.5: When non-text objects have been previously inserted using the tool, the
tool must suggest any previously authored textual equivalents for that non-text object
Test Plan
- tool designer describes how non-text objects can be inserted using the tool on a form
- tool designer details the kinds of non-text objects that can be inserted (and which non-text objects have been inserted previously
as well as when?) on a form
- tool designer defines how the tool records previously authored textual equivalents (and how the tool
associates these with specific non-text objects) on a form
- tool designer describes how the tool prompts the author as mentioned previously, on a form
- Given the previous information, author (tester) verifies that in fact the tool successfully prompts the author as appropriate when
there is a "match"; the results are entered on a form
- Given the previous information, if author (tester) accepts what the prompt suggests, the author verifies that the tool successfully
inserts the proper text equivalent for the correct non-text object every time; the results are entered on a form
- Given the previous information, if author (tester) declines what the prompt suggests, the author verifies that the tool does not
insert a text equivalent in any case; the results are entered on a form
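(NOTE: A hedged sketch of one way a tool might remember previously authored text equivalents so it can suggest them when the same non-text object is inserted again. Keying on a hash of the object's bytes is an assumption made for illustration; the criterion does not prescribe how the association is stored.)

    import hashlib

    class EquivalentRegistry:
        """Remembers text equivalents the author has already written for non-text objects,
        keyed by a digest of the object's bytes, so they can be suggested on re-insertion."""
        def __init__(self):
            self._by_digest = {}

        def record(self, object_bytes, text_equivalent):
            self._by_digest[hashlib.sha256(object_bytes).hexdigest()] = text_equivalent

        def suggest(self, object_bytes):
            """Return a previously authored equivalent, or None if the object is new."""
            return self._by_digest.get(hashlib.sha256(object_bytes).hexdigest())

    # Hypothetical test: the same image bytes inserted twice should trigger a suggestion.
    registry = EquivalentRegistry()
    logo = b"...image bytes..."
    registry.record(logo, "Acme Corporation logo")
    print(registry.suggest(logo))        # "Acme Corporation logo" -> the tool should prompt
    print(registry.suggest(b"other"))    # None -> no prompt expected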
- Success Criteria 3.6: The tool must provide the author with an option to view a listing of all
current accessibility problems.
Test Plan
- The tool designer defines how the tool detects accessibility problems, what kind of accessibility problems are detected,
and how they are reported, on a form
- The tool designer describes how a listing of accessibility problems is presented to the author, on a form
- Given the previous info, the tester (author) verifies that the tool successfully informs the
author of an option to list all known accessibility problems; results are entered on a form (see the NOTE and listing sketch following this test plan)
- Given the previous info, if the author accepts the option, the author verifies that the entire list is made available to the author every time;
results are entered on the form
- Given the previous info, the author verifies that every entry in the list in fact is an accessibility problem (and links back
to the actual problem); results are entered on a form
- Given the previous info, if the author refuses the option, the author verifies that the list is not made available to the author;
results are entered on a form
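(NOTE: A hedged sketch of how a tool might assemble the listing of current accessibility problems from its individual checks, keeping a pointer back to each offending location. The check registry, document shape, and single illustrative rule are assumptions; a real tool would register one check per applicable success criterion.)

    def accessibility_problem_listing(checks, document):
        """Run every registered check and collect its findings into one listing."""
        listing = []
        for name, check in checks.items():
            for location, message in check(document):
                listing.append({"check": name, "location": location, "problem": message})
        return listing

    def missing_alt(document):
        """One illustrative check: non-text content without a text equivalent."""
        for i, img in enumerate(document.get("images", [])):
            if not img.get("alt"):
                yield f"images[{i}]", "non-text content without a text equivalent"

    doc = {"images": [{"src": "a.png", "alt": ""}, {"src": "b.png", "alt": "Diagram of workflow"}]}
    for entry in accessibility_problem_listing({"text-equivalents": missing_alt}, doc):
        print(entry)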
- Success Criteria 3.9:
- The documentation must contain suggested content creation workflow descriptions that
include how and when to use the accessibility-related features of the tool
- For tools that lack a particular accessibility-related feature, the workflow description
must include a workaround for that feature
Test Plan
- tool designer describes all accessibility-related features of the tool (and how to use each) on a form
- tool designer describes documentation of tool on a form
- tool designer describes what content workflow descriptions are supported by the tool, how they are suggested, and
how they are integrated into the documentation, on a form
- Given the previous info, author (tester) verifies that documentation for tool does in fact contain
content creation workflow description; results are entered on a form
- Given the previous info, author (tester) verifies that each such content creation workflow description
does in fact describe how and when to use each accessibility-related feature; results are entered on a form
- tool designer defines all accessibility-related features missing from the tool on a form
- Given the previous info, author (tester) verifies that for each such missing feature a content creation
workflow description in the documentation in fact contains a successful workaround for that feature; results are entered on a form
- Given the previous info, author (tester) verifies that each such workaround does in fact work?;
results are entered on a form?
------------------
- Success Criteria 2.5: All markup strings written automatically by the tool (i.e.,
not authored "by hand") must conform to at least one of the Level 3 success criteria of WCAG2 (10/27/03 draft)(RELATIVE PRIORITY):
Test Plan
- tool designer explains how markup strings are written automatically by the tool (on form)
- tool designer defines which kinds of markup strings can be written by the tool (on a form)
- Given this information, author (tester) enters on form which of the following items are satisfied (NOTE: at least
one must be satisfied; see the NOTE and tab-order sketch following this list):
- Tester verifies that the presentation of the markup does not require the user to read captions and the visual
presentation simultaneously in order to understand the content
- Tester verifies that the structural emphases of the markup are chosen to be distinct on different major visual
display types
- Tester verifies that the content of the markup is constructed such that users can control the presentation of
structural elements or the structure of the markup can be varied through alternate presentation formats
- Tester verifies that for markup when text content is presented over a background image or pattern, the text is
easily readable when the page is viewed in 256 grayscale
- Tester verifies that for markup when text content is presented over a background image or pattern, the text is
easily readable in default presentation mode
- Tester verifies that for markup there are no time limits as a part of a competitive activity
- Tester verifies that for markup the content has been reviewed, taking into account the following strategies for
facilitating orientation and movement, applying as appropriate: breaking up text into logical paragraphs, providing
hierarchical sections and titles, particularly for longer documents, revealing important non-hierarchical relationships,
and dividing very large works into sections/chapters with logical labels
- Tester verifies that for markup information is provided that would allow an assistive technology to
determine at least one logical, linear reading order
- Tester verifies that for markup diagrams are constructed in a fashion so that they have structure that can be accessed by
the user
- Tester verifies that for markup where possible, logical tab order has been created
- Tester verifies that for markup where possible, the user is allowed to select from a list of options as well as
to generate input text directly
- Tester verifies that for markup errors are identified specifically and suggestions for correction are provided
where possible
- Tester verifies that for markup checks for misspelled words are applied and correct spellings are suggested when text
entry is required
- Tester verifies that for markup where consequences are significant and time-response is not important, one of the following
is true: (a) actions are reversible, (b) where not reversible, actions are checked for errors in advance, (c) where not reversible,
and not checkable, a confirmation is asked before acceptance
- Tester verifies that for markup a list is provided on the home page of URIs to cascading dictionaries that can or should be used
to define abbreviations or acronyms
- Tester verifies that for markup the content has been reviewed, taking into account the following strategies for determining the
definition of abbreviations and acronyms, applying them as appropriate: (a) provide a definition or link (with the first occurrence)
of phrases, words, acronyms, and abbreviations specific to a particular community, (b) provide a summary for relationships that may
not be obvious from analyzing the structure of a table but that may be apparent in a visual rendering of the table, (c) if contracted
forms of words are used such that they are ambiguous, provide semantic markup to make words unique and interpretable
- Tester verifies that for markup the content has been reviewed, taking into account the strategies for evaluating the complexity of
content, applying as appropriate.
- Tester verifies that for markup a user can select a different location for navigation elements in the layout of the page
- Tester verifies that for markup the content has been reviewed, taking into account common ideas for making content consistent and predictable,
applying as appropriate
- Tester verifies that for markup a list of technologies and features, support for which is required in order for the content to be operable,
has been determined and is documented in metadata and/or a policy statement associated with the content,
- Tester verifies that for markup technologies and features on the required list are available in at least two independently-developed implementations
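(NOTE: A hedged sketch related to the "logical tab order" item above, for tool-generated markup that uses explicit tabindex attributes. The non-decreasing-tabindex heuristic and the sample string are assumptions for illustration; a genuine assessment of tab order still requires human judgment.)

    import re

    def tabindex_sequence(markup):
        """Explicit tabindex values in the order they appear in the generated markup."""
        return [int(m) for m in re.findall(r'tabindex="(-?\d+)"', markup)]

    def logical_tab_order(markup):
        """Heuristic: positive tabindex values should be non-decreasing in document order,
        so keyboard focus follows the reading order."""
        values = [v for v in tabindex_sequence(markup) if v > 0]
        return all(a <= b for a, b in zip(values, values[1:]))

    sample = '<input tabindex="1"/><input tabindex="3"/><input tabindex="2"/>'
    print("PASS" if logical_tab_order(sample) else "FAIL")   # FAIL: 3 precedes 2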
- Success Criteria 2.6: Any web content (e.g., templates, clip art, multimedia objects,
scripts, applets, example pages, etc.) preferentially licensed (i.e., better terms
for users of the tool than for others) for users of the tool, must satisfy at least one of the
Level 3 success criteria of WCAG2.0 (10/27/03 draft)
(RELATIVE PRIORITY):
Test Plan
- tool designer defines all kinds of web content which are preferentially licensed associated with tool (on form)
- tool designer defines what preferential licensing means in context of the tool (on form)
- Given this info, author (tester) verifies at least one of the following items, and enters results on form:
- Tester verifies that the presentation of such content does not require the user to read captions and the visual
presentation simultaneously in order to understand the content
- Tester verifies that the structural emphases of such content are chosen to be distinct on different major visual
display types
- Tester verifies that the content of such content is constructed such that users can control the presentation of
structural elements or the structure of such content can be varied through alternate presentation formats
- Tester verifies that for such content when text content is presented over a background image or pattern, the text is
easily readable when the page is viewed in 256 grayscale (an informative contrast-check sketch for this item appears after this list)
- Tester verifies that for such content when text content is presented over a background image or pattern, the text is
easily readable in default presentation mode
- Tester verifies that for such content there are no time limits as a part of a competitive activity
- Tester verifies that for such content the content has been reviewed, taking into account the following strategies for
facilitating orientation and movement, applying as appropriate: breaking up text into logical paragraphs, providing
hierarchical sections and titles, particularly for longer documents, revealing important non-hierarchical relationships,
and dividing very large works into sections/chapters with logical labels
- Tester verifies that for such content information is provided that would allow an assistive technology to
determine at least one logical, linear reading order
- Tester verifies that for such content diagrams are constructed in a fashion so that they have structure that can be accessed by
the user
- Tester verifies that for such content where possible, logical tab order has been created
- Tester verifies that for such content where possible, the user is allowed to select from a list of options as well as
to generate input text directly
- Tester verifies that for such content errors are identified specifically and suggestions for correction are provided
where possible
- Tester verifies that for such content checks for misspelled words are applied and correct spellings are suggested when text
entry is required
- Tester verifies that for such content where consequences are significant and time-response is not important, one of the following
is true: (a) actions are reversible, (b) where not reversible, actions are checked for errors in advance, (c) where not reversible,
and not checkable, a confirmation is asked before acceptance
- Tester verifies that for such content a list is provided on the home page of URIs to cascading dictionaries that can or should be used
to define abbreviations or acronyms
- Tester verifies that for such content the content has been reviewed, taking into account the following strategies for determining the
definition of abbreviations and acronyms, applying them as appropriate: (a) provide a definition or link (with the first occurrence)
of phrases, words, acronyms, and abbreviations specific to a particular community, (b) provide a summary for relationships that may
not be obvious from analyzing the structure of a table but that may be apparent in a visual rendering of the table, (c) if contracted
forms of words are used such that they are ambiguous, provide semantic markup to make words unique and interpretable
- Tester verifies that for such content the content has been reviewed, taking into account the strategies for evaluating the complexity of
content, applying as appropriate.
- Tester verifies that for such content a user can select a different location for navigation elements in the layout of the page
- Tester verifies that for such content the content has been reviewed, taking into account common ideas for making content consistent and predictable,
applying as appropriate
- Tester verifies that for such content a list of technologies and features, support for which is required in order for the content to be operable,
has been determined and is documented in metadata and/or a policy statement associated with the content.
- Tester verifies that for such content technologies and features on the required list are available in at least two independently-developed implementations
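Note (informative): for the grayscale-readability item above, "easily readable" is easier to record consistently on the form if the tester samples the text colour and the background region behind it and compares their grey levels. The sketch below uses the WCAG relative-luminance formula; the 4.5:1 pass mark is an assumed working threshold to be agreed and noted on the form, not a figure taken from this plan.

    def relative_luminance(rgb):
        """WCAG relative luminance for an sRGB colour given as 0-255 integers."""
        def channel(c):
            c = c / 255.0
            return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
        r, g, b = (channel(c) for c in rgb)
        return 0.2126 * r + 0.7152 * g + 0.0722 * b

    def contrast_ratio(colour_a, colour_b):
        lighter, darker = sorted((relative_luminance(colour_a), relative_luminance(colour_b)), reverse=True)
        return (lighter + 0.05) / (darker + 0.05)

    def readable_in_grayscale(text_rgb, background_rgb, threshold=4.5):
        """Collapse both colours to single grey levels (as a 256-grayscale view would)
        and apply the assumed pass mark; the boolean result is what goes on the form."""
        def grey(rgb):
            level = round(255 * relative_luminance(rgb))
            return (level, level, level)
        return contrast_ratio(grey(text_rgb), grey(background_rgb)) >= threshold

    # Example: dark text sampled over a light region of the background image.
    print(readable_in_grayscale((40, 40, 40), (230, 230, 230)))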
- Success Criteria 3.1:
- When the actions of the author risk creating accessibility problems according to the
Level 3 success criteria of WCAG2.0 (10/27/03 draft)
the tool must intervene to introduce the appropriate accessible
authoring practice. This intervention may proceed according to a user-configurable schedule.
- The intervention must occur at least once before completion of authoring (e.g., final save,
publishing, etc.)
(RELATIVE PRIORITY):
Test Plan
- tool designer describes kinds of accessibility problems that can be detected using tool (on form)
- tool designer describes how tool intervenes and notifies author (on form)
- tool designer describes how author can configure schedule (on form)
- author (tester) configures intervention schedule using previous info; note is made on form
- Given this info, author (tester) verifies each of the following items (or N/A as appropriate) and enters results on form (x is a small number
of seconds, and the intervention is assumed to occur before completion of authoring; an informative timing sketch appears after this list):
- Tester verifies that if it is determined that the presentation requires the user to read captions and the visual
presentation simultaneously in order to understand the content, the tool successfully intervenes within x seconds of the determination.
- Tester verifies that if it is determined that the structural emphases are not chosen to be distinct on different major visual
display types, the tool successfully intervenes within x seconds of the determination.
- Tester verifies that if it is determined that the content is not constructed such that users can control the presentation of
structural elements or the structure of such content can not be varied through alternate presentation formats, the tool successfully intervenes within x seconds of the determination.
- Tester verifies that if it is determined that when text content is presented over a background image or pattern, the text is
not easily readable when the page is viewed in 256 grayscale, the tool successfully intervenes within x seconds of the determination.
- Tester verifies that if it is determined that when text content is presented over a background image or pattern, the text is
not easily readable in default presentation mode, the tool successfully intervenes within x seconds of the determination.
- Tester verifies that if it is determined that there are time limits as a part of a competitive activity, the tool successfully intervenes within x seconds of the determination.
- Tester verifies that if it is determined that the content has not been reviewed, taking into account the following strategies for
facilitating orientation and movement, applying as appropriate: breaking up text into logical paragraphs, providing
hierarchical sections and titles, particularly for longer documents, revealing important non-hierarchical relationships,
and dividing very large works into sections/chapters with logical labels, the tool successfully intervenes within x seconds of the determination.
- Tester verifies that if it is determined that information is not provided that would allow an assistive technology to
determine at least one logical, linear reading order, the tool successfully intervenes within x seconds of the determination.
- Tester verifies that if it is determined that diagrams are not constructed in a fashion so that they have structure that can be accessed by
the user, the tool successfully intervenes within x seconds of the determination.
- Tester verifies that if it is determined that where possible, logical tab order has not been created, the tool successfully intervenes within x seconds of the determination.
- Tester verifies that if it is determined that where possible, the user is not allowed to select from a list of options as well as
to generate input text directly, the tool successfully intervenes within x seconds of the determination.
- Tester verifies that if it is determined that errors are not identified specifically and suggestions for correction are not provided
where possible, the tool successfully intervenes within x seconds of the determination.
- Tester verifies that if it is determined that checks for misspelled words are not applied and correct spellings are not suggested when text
entry is required, the tool successfully intervenes within x seconds of the determination.
- Tester verifies that if it is determined that where consequences are significant and time-response is not important, none of the following
is true: (a) actions are reversible, (b) where not reversible, actions are checked for errors in advance, (c) where not reversible,
and not checkable, a confirmation is asked before acceptance, the tool successfully intervenes within x seconds of the determination.
- Tester verifies that if it is determined that a list is not provided on the home page of URIs to cascading dictionaries that can or should be used
to define abbreviations or acronyms, the tool successfully intervenes within x seconds of the determination.
- Tester verifies that if it is determined that the content has not been reviewed, taking into account the following strategies for determining the
definition of abbreviations and acronyms, applying them as appropriate: (a) provide a definition or link (with the first occurrence)
of phrases, words, acronyms, and abbreviations specific to a particular community, (b) provide a summary for relationships that may
not be obvious from analyzing the structure of a table but that may be apparent in a visual rendering of the table, (c) if contracted
forms of words are used such that they are ambiguous, provide semantic markup to make words unique and interpretable, the tool successfully intervenes within x seconds of the determination.
- Tester verifies that if it is determined that the content has not been reviewed, taking into account the strategies for evaluating the complexity of
content, applying as appropriate, the tool successfully intervenes within x seconds of the determination.
- Tester verifies that if it is determined that a user can not select a different location for navigation elements in the layout of the page, the tool successfully intervenes within x seconds of the determination.
- Tester verifies that if it is determined that the content has not been reviewed, taking into account common ideas for making content consistent and predictable,
applying as appropriate, the tool successfully intervenes within x seconds of the determination.
- Tester verifies that if it is determined that a list of technologies and features, support for which is required in order for the content to be operable,
has not been determined and is not documented in metadata and/or a policy statement associated with the content, the tool successfully intervenes within x seconds of the determination.
- Tester verifies that if it is determined that technologies and features on the required list are not available in at least two independently-developed implementations,
the tool successfully intervenes within x seconds of the determination.
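Note (informative): every item above asks whether the tool intervenes within x seconds of the determination, so the tester needs a repeatable way to time that gap. A minimal stopwatch helper such as the one below could be used; the value of x, and how the tester observes the moment of determination and the moment of intervention, are assumptions to be agreed and recorded on the form.

    import time

    class InterventionTimer:
        """Record when a problem is determined and when the tool intervenes,
        and report whether the gap stayed within the agreed x seconds."""
        def __init__(self, x_seconds):
            self.x = x_seconds
            self.determined_at = None

        def mark_determination(self):
            # tester calls this the moment the problem is observed to be detected
            self.determined_at = time.monotonic()

        def mark_intervention(self):
            # tester calls this the moment the tool's intervention appears
            if self.determined_at is None:
                raise RuntimeError("mark_determination() must be called first")
            elapsed = time.monotonic() - self.determined_at
            return {"elapsed_seconds": round(elapsed, 2), "within_x": elapsed <= self.x}

    # Example use, with x assumed to be 5 seconds for the test run:
    timer = InterventionTimer(x_seconds=5)
    timer.mark_determination()
    # ... tester waits for the tool to intervene ...
    print(timer.mark_intervention())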
- Success Criteria 3.2: The tool must provide a check (automated check, semi-automated check or
manual check) for detecting violations of each Level 3 success criterion of WCAG2.0 (10/27/03 draft) (RELATIVE PRIORITY):
Test Plan
- The tool designer defines how checks (for detecting accessibility violations) are provided by the tool on a form
- The tool designer describes each check (for detecting accessibility violations), and what kind it is, on a form
- Given this info, the author (tester) verifies each following item (or enters N/A as appropriate) on a form:
- Tester verifies that if it is determined that the presentation requires the user to read captions and the visual
presentation simultaneously in order to understand the content, the tool successfully provides a check within x seconds of the determination.
- Tester verifies that if it is determined that the structural emphases are not chosen to be distinct on different major visual
display types, the tool successfully provides a check within x seconds of the determination.
- Tester verifies that if it is determined that the content is not constructed such that users can control the presentation of
structural elements or the structure of such content can not be varied through alternate presentation formats, the tool successfully provides a check within x seconds of the determination.
- Tester verifies that if it is determined that when text content is presented over a background image or pattern, the text is
not easily readable when the page is viewed in 256 grayscale, the tool successfully provides a check within x seconds of the determination.
- Tester verifies that if it is determined that when text content is presented over a background image or pattern, the text is
not easily readable in default presentation mode, the tool successfully provides a check within x seconds of the determination.
- Tester verifies that if it is determined that there are time limits as a part of a competitive activity, the tool successfully provides a check within x seconds of the determination.
- Tester verifies that if it is determined that the content has not been reviewed, taking into account the following strategies for
facilitating orientation and movement, applying as appropriate: breaking up text into logical paragraphs, providing
hierarchical sections and titles, particularly for longer documents, revealing important non-hierarchical relationships,
and dividing very large works into sections/chapters with logical labels, the tool successfully provides a check within x seconds of the determination.
- Tester verifies that if it is determined that information is not provided that would allow an assistive technology to
determine at least one logical, linear reading order, the tool successfully provides a check within x seconds of the determination.
- Tester verifies that if it is determined that diagrams are not constructed in a fashion so that they have structure that can be accessed by
the user, the tool successfully provides a check within x seconds of the determination.
- Tester verifies that if it is determined that where possible, logical tab order has not been created, the tool successfully provides a check within x seconds of the determination (an informative tab-order check sketch appears after this list).
- Tester verifies that if it is determined that where possible, the user is not allowed to select from a list of options as well as
to generate input text directly, the tool successfully provides a check within x seconds of the determination.
- Tester verifies that if it is determined that errors are not identified specifically and suggestions for correction are not provided
where possible, the tool successfully provides a check within x seconds of the determination.
- Tester verifies that if it is determined that checks for misspelled words are not applied and correct spellings are not suggested when text
entry is required, the tool successfully provides a check within x seconds of the determination.
- Tester verifies that if it is determined that where consequences are significant and time-response is not important, none of the following
is true: (a) actions are reversible, (b) where not reversible, actions are checked for errors in advance, (c) where not reversible,
and not checkable, a confirmation is asked before acceptance, the tool successfully provides a check within x seconds of the determination.
- Tester verifies that if it is determined that a list is not provided on the home page of URIs to cascading dictionaries that can or should be used
to define abbreviations or acronyms, the tool successfully provides a check within x seconds of the determination.
- Tester verifies that if it is determined that the content has not been reviewed, taking into account the following strategies for determining the
definition of abbreviations and acronyms, applying them as appropriate: (a) provide a definition or link (with the first occurrence)
of phrases, words, acronyms, and abbreviations specific to a particular community, (b) provide a summary for relationships that may
not be obvious from analyzing the structure of a table but that may be apparent in a visual rendering of the table, (c) if contracted
forms of words are used such that they are ambiguous, provide semantic markup to make words unique and interpretable, the tool successfully provides a check within x seconds of the determination.
- Tester verifies that if it is determined that the content has not been reviewed, taking into account the strategies for evaluating the complexity of
content, applying as appropriate, the tool successfully provides a check within x seconds of the determination.
- Tester verifies that if it is determined that a user can not select a different location for navigation elements in the layout of the page, the tool successfully provides a check within x seconds of the determination.
- Tester verifies that if it is determined that the content has not been reviewed, taking into account common ideas for making content consistent and predictable,
applying as appropriate, the tool successfully provides a check within x seconds of the determination.
- Tester verifies that if it is determined that a list of technologies and features, support for which is required in order for the content to be operable,
has not been determined and is not documented in metadata and/or a policy statement associated with the content, the tool successfully provides a check within x seconds of the determination.
- Tester verifies that if it is determined that technologies and features on the required list are not available in at least two independently-developed implementations,
the tool successfully provides a check within x seconds of the determination.
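Note (informative): for the tab-order item above, one semi-automated check is to extract explicit tab-order values from the markup and confirm they never decrease in document order, flagging anything else for manual review. The sketch below assumes HTML-style tabindex attributes; other markup languages would need an equivalent extraction step.

    from html.parser import HTMLParser

    class TabIndexScan(HTMLParser):
        """Collect explicit tabindex values in document order."""
        def __init__(self):
            super().__init__()
            self.order = []
        def handle_starttag(self, tag, attrs):
            value = dict(attrs).get("tabindex")
            if value and value.lstrip("-").isdigit():
                self.order.append((self.getpos(), tag, int(value)))

    def tab_order_is_logical(markup):
        """Weak machine check: explicit tabindex values should not decrease in
        document order; anything else is flagged for manual review by the tester."""
        scanner = TabIndexScan()
        scanner.feed(markup)
        values = [v for _, _, v in scanner.order]
        return values == sorted(values)

    # Example: the out-of-sequence tabindex causes the check to fail.
    sample = '<input tabindex="1"><input tabindex="3"><input tabindex="2">'
    print(tab_order_is_logical(sample))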
- Success Criteria 3.3: The tool must provide a repair (automated repair, semi-automated repair or
manual repair) for correcting violations of each Level 3 requirement of WCAG2.0 (10/27/03 draft) (RELATIVE PRIORITY):
Test Plan
- The tool designer defines how repairs (for correcting accessibility violations) are provided by the tool on a form
- The tool designer describes each repair (for correcting accessibility violations), and what kind it is, on a form
- Given this info, the author (tester) verifies each following item (or enters N/A as appropriate) on a form:
- Tester verifies that if it is determined that the presentation requires the user to read captions and the visual
presentation simultaneously in order to understand the content, the tool successfully provides a repair within x seconds of the determination.
- Tester verifies that if it is determined that the structural emphases are not chosen to be distinct on different major visual
display types, the tool successfully provides a repair within x seconds of the determination.
- Tester verifies that if it is determined that the content is not constructed such that users can control the presentation of
structural elements or the structure of such content can not be varied through alternate presentation formats, the tool successfully provides a repair within x seconds of the determination.
- Tester verifies that if it is determined that when text content is presented over a background image or pattern, the text is
not easily readable when the page is viewed in 256 grayscale, the tool successfully provides a repair within x seconds of the determination.
- Tester verifies that if it is determined that when text content is presented over a background image or pattern, the text is
not easily readable in default presentation mode, the tool successfully provides a repair within x seconds of the determination.
- Tester verifies that if it is determined that there are time limits as a part of a competitive activity, the tool successfully provides a repair within x seconds of the determination.
- Tester verifies that if it is determined that the content has not been reviewed, taking into account the following strategies for
facilitating orientation and movement, applying as appropriate: breaking up text into logical paragraphs, providing
hierarchical sections and titles, particularly for longer documents, revealing important non-hierarchical relationships,
and dividing very large works into sections/chapters with logical labels, the tool successfully provides a repair within x seconds of the determination.
- Tester verifies that if it is determined that information is not provided that would allow an assistive technology to
determine at least one logical, linear reading order, the tool successfully provides a repair within x seconds of the determination.
- Tester verifies that if it is determined that diagrams are not constructed in a fashion so that they have structure that can be accessed by
the user, the tool successfully provides a repair within x seconds of the determination.
- Tester verifies that if it is determined that where possible, logical tab order has not been created, the tool successfully provides a repair within x seconds of the determination.
- Tester verifies that if it is determined that where possible, the user is not allowed to select from a list of options as well as
to generate input text directly, the tool successfully provides a repair within x seconds of the determination.
- Tester verifies that if it is determined that errors are not identified specifically and suggestions for correction are not provided
where possible, the tool successfully provides a repair within x seconds of the determination.
- Tester verifies that if it is determined that checks for misspelled words are not applied and correct spellings are not suggested when text
entry is required, the tool successfully provides a repair within x seconds of the determination (an informative suggestion-repair sketch appears after this list).
- Tester verifies that if it is determined that where consequences are significant and time-response is not important, none of the following
is true: (a) actions are reversible, (b) where not reversible, actions are checked for errors in advance, (c) where not reversible,
and not checkable, a confirmation is asked before acceptance, the tool successfully provides a repair within x seconds of the determination.
- Tester verifies that if it is determined that a list is not provided on the home page of URIs to cascading dictionaries that can or should be used
to define abbreviations or acronyms, the tool successfully provides a repair within x seconds of the determination.
- Tester verifies that if it is determined that the content has not been reviewed, taking into account the following strategies for determining the
definition of abbreviations and acronyms, applying them as appropriate: (a) provide a definition or link (with the first occurrence)
of phrases, words, acronyms, and abbreviations specific to a particular community, (b) provide a summary for relationships that may
not be obvious from analyzing the structure of a table but that may be apparent in a visual rendering of the table, (c) if contracted
forms of words are used such that they are ambiguous, provide semantic markup to make words unique and interpretable, the tool successfully provides a repair within x seconds of the determination.
- Tester verifies that if it is determined that the content has not been reviewed, taking into account the strategies for evaluating the complexity of
content, applying as appropriate, the tool successfully provides a repair within x seconds of the determination.
- Tester verifies that if it is determined that a user can not select a different location for navigation elements in the layout of the page, the tool successfully provides a repair within x seconds of the determination.
- Tester verifies that if it is determined that the content has not been reviewed, taking into account common ideas for making content consistent and predictable,
applying as appropriate, the tool successfully provides a repair within x seconds of the determination.
- Tester verifies that if it is determined that a list of technologies and features, support for which is required in order for the content to be operable,
has not been determined and is not documented in metadata and/or a policy statement associated with the content, the tool successfully provides a repair within x seconds of the determination.
- Tester verifies that if it is determined that technologies and features on the required list are not available in at least two independently-developed implementations,
the tool successfully provides a repair within x seconds of the determination.
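Note (informative): several repair items above, such as the misspelled-word item, amount to offering the author ranked suggestions to accept or reject. The sketch below shows one way a semi-automated repair of that kind could surface suggestions using only the Python standard library; the word list is a stand-in, since a real tool would supply its own dictionary, and nothing here is specific to any particular authoring tool.

    import difflib

    # Stand-in dictionary; a real authoring tool would supply its own word list.
    DICTIONARY = ["accessibility", "content", "navigation", "structure", "options"]

    def suggest_repairs(words, dictionary=DICTIONARY, max_suggestions=3):
        """For each word not found in the dictionary, propose close matches for the
        author to accept or reject (a semi-automated repair)."""
        repairs = {}
        for word in words:
            if word.lower() not in dictionary:
                repairs[word] = difflib.get_close_matches(word.lower(), dictionary, n=max_suggestions)
        return repairs

    # Example: the misspelled words are flagged and likely corrections suggested.
    print(suggest_repairs(["content", "accessibilty", "navigaton"]))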
References
- ATAG 2.0 Working Draft, 21 October 2003
- WCAG 2.0 Working Draft, 27 October 2003