draft review of TSDTF test samples assigned to me - my action item

NOTE: I based my review on the latest files found in:

http://www.w3.org/2006/tsdtf/TestSampleStatusList

Apologies for any errors or omissions. Also, this is mostly a structure
review, though I did get into some content issues. I had quite a few
questions as well, which are noted.

Thanks and best wishes
Tim Boland NIST

First general comments, and then more specific comments.

GENERAL COMMENTS:

(1) Need a way to handle versioning of WCAG drafts and techniques
in the metadata - the techniques links in the metadata still date
from the May WCAG draft, even though the December draft has been
published and the WCAG techniques links have been updated?

(2) Need a way to give the title of the SC, not just the number, in the
metadata, because the number may change between successive WCAG drafts as
SCs are added or removed, so the referenced number may become incorrect.
I know because I recently updated the WCAG1-WCAG2 mapping and found that
older SC numbers had changed, or had been deleted, in earlier WCAG
versions.
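
A rough sketch of what I mean (the element and attribute names here are
hypothetical, not the TSDTF schema; the title and date are just
illustrative values):

    # Carry both the SC number and its title (plus the draft date) in the metadata,
    # so a reviewer can spot when a number no longer matches the draft being cited.
    import xml.etree.ElementTree as ET

    def build_sc_reference(sc_number, sc_title, wcag_draft_date):
        ref = ET.Element("successCriterion")
        ref.set("number", sc_number)       # e.g. "1.2.1" - may change between drafts
        ref.set("draft", wcag_draft_date)  # pins which WCAG draft the number comes from
        ET.SubElement(ref, "title").text = sc_title  # SC title text, copied from that draft
        return ref

    print(ET.tostring(build_sc_reference("1.2.1", "Captions (Prerecorded)", "2007-12-11"),
                      encoding="unicode"))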

(3) There needs to be some mention of what software is needed to correctly
play the files in the testfiles. One of the criteria is that the testfiles
"work correctly", but that is partially dependent upon the user's
environment and installed software? What is the test environment for these
tests?

(4) In the structure review following, does the "test files" portion refer
just to the actual test file, or to both the metadata file and the test
file? The distinction can be confusing - for example, there are links in
both the metadata file and the test file, so the checklist for "links
working correctly" can be applied to both files? Similarly, in the
"metadata" portion, the checklist for "titles being accurate" can be
applied to both the metadata file and the test files? Should the labelling
of the structure review match progression through first the metadata file
and then the associated test file (if not already done)?

(5) The naming convention for these files uses "l1", etc., for the levels of
the WCAG SCs, but WCAG SCs now use levels A, AA, and AAA, so this may be
confusing?
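
For illustration, a minimal sketch of a file-name check, assuming my
reading of the current convention (e.g. "sc1.2.1_l1_001.xml") and one
possible alternative that uses the A/AA/AAA levels instead:

    import re

    # Current convention as I understand it (not an official definition).
    CURRENT = re.compile(r"^sc\d+\.\d+\.\d+_l[123]_\d{3}\.(xml|html)$")
    # A possible alternative matching the WCAG conformance levels A/AA/AAA.
    PROPOSED = re.compile(r"^sc\d+\.\d+\.\d+_(a|aa|aaa)_\d{3}\.(xml|html)$")

    for name in ["sc1.2.1_l1_001.xml", "sc1.2.1_a_001.xml"]:
        print(name, bool(CURRENT.match(name)), bool(PROPOSED.match(name)))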

---------------------------------------

SPECIFIC REVIEWS:

----------------------------------------

Structure Review for Test Sample sc1.2.1_l1_001

Contact Information

Review Criteria - name and email address of the submitter are available
Review Result - fail?
Comment - I could not find them in the metadata file?

Review Criteria - organization on whose behalf the test sample was submitted
Review Result - pass
Comment - I found it in the metadata file (ERTWG?)

Test Files

Review Criteria - all the files that are necessary to execute the test 
procedure
have been submitted
Review Result - pass
Comment - looks like everything's there in the testfile itself?

Review Criteria - all the submitted files follow the naming convention and 
directory
structure
Review Result - not sure?
Comments - In terms of naming convention, the metadata file name
"sc1.2.1_l1_001.xml" seems to comply (except for "l1"? - see general comment
#5 above). Also, what happens if sc1.2.1 becomes something new as a result
of a new version of WCAG (see general comment #2 above)? The actual testfile
"sc1.2.1_l1_001.html" has "html" as the file type, but the "primary"
technology is xhtml judging from the doctype? What should the relationship
be between file type and "primary" technology (and what does "primary"
mean)?
In terms of directory structure, the metadata file seems to comply (the
"xhtml/metadata" part), but are subdirectories allowed in this structure?
The actual test file has "html" listed as a file type, which seems OK, but
what does this imply in terms of "primary technology" (the doctype is
"xhtml1/strict")? The "xhtml/testfiles" part of the path seems OK, but then
there is a subdirectory "resources/video..", which seems inconsistent with
the listing under "directory structure"? Also, under "testfiles" in the
process document there are "resource" subdirectories, but after "testfiles"
in this case there are the actual files - is this inconsistent (should there
be a "resource" part before the actual files for consistency)?

Review Criteria - all the files include valid markup unless otherwise 
required by the test
Review Result - fail?
Comments - The metadata file validates as "well formed XML (1 warning)"
according to the W3C validator, but not all the files validate according to
the W3C validator (may need to document the exceptions?). The actual
testfile fails validation with 4 errors (according to the W3C validator).
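
As an aside, a quick local well-formedness check could catch this kind of
thing before files reach the W3C validator. A minimal sketch
(well-formedness only, not schema validity; the file name is just the
sample under review):

    import xml.etree.ElementTree as ET

    def is_well_formed(path):
        try:
            ET.parse(path)          # parses the file; raises ParseError if not well formed
            return True, None
        except ET.ParseError as err:
            return False, str(err)

    ok, err = is_well_formed("sc1.2.1_l1_001.xml")
    print("well formed" if ok else "parse error: " + err)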

Review Criteria - all the files include correct links unless otherwise 
required by the test
Review Result - cannot tell?
Comments - Do we need to check all the links? I checked a few in the
metadata file, and they seemed OK, but all the schema links need to be
checked for correctness. What is the definition of a "correct link"?
The video in the testfile seemed to play OK for me, and I got the sound OK,
but some of the buttons (the last four) were "grayed out".
Also, should a "test purpose" somehow be included in the html file? There
were no captions in the video file for me, but someone watching may forget
what the metadata file says.
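
If we do need to check every link, something like the following sketch
could make that repeatable (it only collects attribute values that look
like http(s) URLs and reports the HTTP status, which is a weak proxy for
"correct link" and still leaves that term undefined):

    import urllib.request
    import xml.etree.ElementTree as ET

    def collect_urls(path):
        """Collect every attribute value in the metadata file that looks like a URL."""
        urls = set()
        for _, elem in ET.iterparse(path):
            for value in elem.attrib.values():
                if value.startswith("http://") or value.startswith("https://"):
                    urls.add(value)
        return urls

    def status(url):
        try:
            with urllib.request.urlopen(url, timeout=10) as resp:
                return resp.status
        except Exception as err:
            return err

    for url in sorted(collect_urls("sc1.2.1_l1_001.xml")):
        print(url, status(url))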

Review Criteria - all the files include correct spelling unless otherwise 
required by the test
Review Result - pass?
Comments - I could not find any spelling errors, but again, what is the
definition of "correct spelling"? Which dictionary is being used?


Metadata

Review Criteria - all the dates and other integer or literal values have 
the correct format
Review Result - cannot tell?
Comments - what is the definition of "correct format"?

Review Criteria - all static values (especially copyright notices) are 
included and accurate
Review Result - pass
Comments - no comment - they seem to be included in the metadata file

Review Criteria - all titles, descriptions, and other required fields are 
included and accurate
Review Result - cannot tell
Comments - What is the definition of "accurate"? There is a "title" tag in
the metadata file which accurately says "a video with no captions", but in
the actual testfile there is a "title" tag that says "prerecorded multimedia
with captions", which does not seem accurate and contradicts the title tag
in the metadata (I did not notice any captions in the video when it played).
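
Consistency between the two titles could at least be checked mechanically,
even if "accurate" still needs a definition. A rough sketch (it assumes the
testfile is well-formed enough to parse as XML, which may not hold given
the validation errors above):

    import xml.etree.ElementTree as ET

    def first_title(path):
        """Return the text of the first element whose tag ends with 'title'."""
        for elem in ET.parse(path).iter():
            if elem.tag.endswith("title") and elem.text:
                return elem.text.strip()
        return None

    meta_title = first_title("sc1.2.1_l1_001.xml")
    test_title = first_title("sc1.2.1_l1_001.html")
    print("match" if meta_title == test_title
          else "mismatch: %r vs %r" % (meta_title, test_title))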

Review Criteria - all identifiers (especially ID for techniques and rules) 
are used correctly
Review Result - not sure?
Comments - Aside from the id="sc1.2.1_l1_001" in the "xmlns:btw" (which
seems OK?), the only id attribute I found in the metadata file was on the
technique tag, which contained "F8". I think there should be a technique
description as well, rather than just "F8" (see my earlier comments),
because if techniques change, "F8" may no longer be correct, and it may be
difficult to determine correctness in the future - a document management
issue. Also, the "may" reference is no longer correct, because there is now
a "december" reference. No id attributes were found in the html testfile.
The same goes for the "F8" reference after the "location" tag under the
"rule" element above - there is also a "may" reference there, and the "F8"
there has no path to give it context. Why is "F8" listed twice under the
"rule" element? It doesn't seem to add anything. SIDE NOTE: I'm not sure
that this test adequately tests F8, because F8 seems to assume the initial
presence/availability of captions, and this test has no captions to begin
with?
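
What I have in mind for the technique reference is something like the
sketch below - hypothetical element and attribute names, with the draft
date likewise illustrative; the description text is a placeholder that
should be copied verbatim from the dated Techniques draft being cited:

    import xml.etree.ElementTree as ET

    tech = ET.Element("technique", {"id": "F8", "draft": "2007-12-11"})
    ET.SubElement(tech, "description").text = (
        "Short title of F8, copied verbatim from the dated Techniques draft being cited")
    print(ET.tostring(tech, encoding="unicode"))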

Review Criteria - all structures such as rules, techniques, or pointers are 
used correctly
Review Result - cannot tell?
Comments - What is the definition of "used correctly"? See the previous
comment on the technique structure. Other pointers seem OK in the metadata
file, but perhaps I missed something? In light of the recent telecon
discussion on the primary rule, perhaps this part will change? The doctype
and namespace "pointers" in the html testfile are "xhtml". What happens if
my environment doesn't support "wmv"? I compared the metadata in the
metadata file against the metadata in the process document. There are two
"technical specs" sections - is the "testelement" section still correct in
light of recent WCAG evolution? The "complexity" attribute on the "testcase"
element - is that still important in light of the recent telecon discussion
on that attribute? In the process document, it says that the required "file"
element MUST contain one of four "optional" choices (the wording seems
confusing - what if none of the options is included, as in this example?
what happens to MUST?).
Everything else seems OK.


-------------------------------------------------------------------------------

Structure Review for Test Sample sc1.2.1_l1_002

Contact Information

Review Criteria - name and email address of the submitter are available
Review Result - fail?
Comment - I could not find them in the metadata file?

Review Criteria - organization on whose behalf the test sample was submitted
Review Result - pass
Comment - I found it in the metadata file (ERTWG?)

Test Files

Review Criteria - all the files that are necessary to execute the test 
procedure
have been submitted
Review Result - pass
Comment - looks like everything's there in the testfile itself?

Review Criteria - all the submitted files follow the naming convention and 
directory
structure
Review Result - not sure?
Comments - In terms of naming convention, the metadata file name
"sc1.2.1_l1_002.xml" seems to comply (except for "l1"? - see general comment
#5 above). Also, what happens if sc1.2.1 becomes something new as a result
of a new version of WCAG (see general comment #2 above)? The actual testfile
"sc1.2.1_l1_002.html" has "html" as the file type, but the "primary"
technology is xhtml judging from the doctype? What should the relationship
be between file type and "primary" technology (and what does "primary"
mean)?
In terms of directory structure, the metadata file seems to comply (the
"xhtml/metadata" part), but are subdirectories allowed in this structure?
The actual test file has "html" listed as a file type, which seems OK, but
what does this imply in terms of "primary technology" (the doctype is
"xhtml1/strict")? The "xhtml/testfiles" part of the path seems OK, but then
there is a subdirectory "resources/video..", which seems inconsistent with
the listing under "directory structure"? Also, under "testfiles" in the
process document there are "resource" subdirectories, but after "testfiles"
in this case there are the actual files - is this inconsistent (should there
be a "resource" part before the actual files for consistency)?

Review Criteria - all the files include valid markup unless otherwise 
required by the test
Review Result - fail?
Comments - The metadata file validates as "well formed XML (1 warning)"
according to the W3C validator, but not all the files validate according to
the W3C validator (may need to document the exceptions?). The actual
testfile fails validation with 4 errors (according to the W3C validator).

Review Criteria - all the files include correct links unless otherwise 
required by the test
Review Result - cannot tell?
Comments - Do we need to check all the links? I checked a few in the
metadata file, and they seemed OK, but all the schema links need to be
checked for correctness. What is the definition of a "correct link"?
The video in the testfile seemed to play OK for me, and I got the sound OK,
but some of the buttons (the last four) were "grayed out".
Also, should a "test purpose" somehow be included in the html file? There
were captions at the bottom of the movie in the video file for me, but
someone watching may forget what the metadata file says.

Review Criteria - all the files include correct spelling unless otherwise 
required by the test
Review Result - pass?
Comments - I could not find any spelling errors, but again, what is the
definition of "correct spelling"? Which dictionary is being used?


Metadata

Review Criteria - all the dates and other integer or literal values have 
the correct format
Review Result - cannot tell?
Comments - what is the definition of "correct format"?

Review Criteria - all static values (especially copyright notices) are 
included and accurate
Review Result - pass
Comments - no comment - they seem to be included in the metadata file

Review Criteria - all titles, descriptions, and other required fields are 
included and accurate
Review Result - cannot tell
Comments - What is the definition of "accurate"? There is a "title" tag in
the metadata file which accurately says "a video with captions", and in the
actual testfile there is a "title" tag that says "video with captions",
which seems consistent and accurate according to the actual video.

Review Criteria - all identifiers (especially ID for techniques and rules) 
are used correctly
Review Result - not sure?
Comments - Aside from the id="sc1.2.1_l1_002" in the "xmlns:btw", the only
id attribute I found in the metadata file was on the technique tag, which
contained "G93". I think there should be a technique description as well,
rather than just "G93" (see my earlier comments), because if techniques
change, "G93" may no longer be correct, and it may be difficult to determine
correctness in the future - a document management issue. Also, the "may"
reference is no longer correct, because there is now a "december" reference.
No id attributes were found in the html testfile. The same goes for the
"G93" reference after the "location" tag under the "rule" element above -
there is also a "may" reference there, and the "G93" there has no path to
give it context. Why is "G93" listed twice under the "rule" element? It
doesn't seem to add anything. Also, why is the expected result "cannot tell"
when the "purpose" tag above says "test case intended to pass.."? Minor
nit - the order of the attributes in this "technique" tag is different from
the order earlier in the metadata file, and the xlink path name is slightly
different. SIDE NOTE: At least from the description in G93, this test seems
to have the same scope.
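
The purpose/expected-result mismatch seems like something a small script
could flag across all the samples. A sketch, assuming element/attribute
names ("purpose", "expectedResult") guessed from the files under review
rather than taken from the published schema:

    import re
    import xml.etree.ElementTree as ET

    def purpose_vs_expected(path):
        root = ET.parse(path).getroot()
        purpose = " ".join(e.text or "" for e in root.iter()
                           if e.tag.endswith("purpose"))
        expected = None
        for e in root.iter():            # take the last expectedResult attribute seen
            expected = e.get("expectedResult", expected)
        m = re.search(r"intended to (pass|fail)", purpose)
        if m and expected and m.group(1) not in expected.lower():
            return "possible mismatch: purpose says %r, expectedResult=%r" % (m.group(0), expected)
        return "no obvious mismatch"

    print(purpose_vs_expected("sc1.2.1_l1_002.xml"))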

Review Criteria - all structures such as rules, techniques, or pointers are 
used correctly
Review Result - cannot tell?
Comments - What is the definition of "used correctly"? See the previous
comment on the technique structure. Other pointers seem OK in the metadata
file, but perhaps I missed something? In light of the recent telecon
discussion on the primary rule, perhaps this part will change? The doctype
and namespace "pointers" in the html testfile are "xhtml". What happens if
my environment doesn't support "wmv"? I compared the metadata in the
metadata file against the metadata in the process document. There are two
"technical specs" sections - is the "testelement" section still correct in
light of recent WCAG evolution? The "complexity" attribute on the "testcase"
element - is that still important in light of the recent telecon discussion
on that attribute? In the process document, it says that the required "file"
element MUST contain one of four "optional" choices - even though the first
option is included in this example, the wording in the metadata document
seems confusing (what happens to MUST?).
Everything else seems OK.






-------------------------------------------------------------------


Structure Review for Test Sample sc1.2.5_l3_001

Contact Information

Review Criteria - name and email address of the submitter are available
Review Result - fail?
Comment - I could not find them in the metadata file?

Review Criteria - organization on whose behalf the test sample was submitted
Review Result - pass
Comment - I found it in the metadata file (ERTWG?)

Test Files

Review Criteria - all the files that are necessary to execute the test 
procedure
have been submitted
Review Result - pass
Comment - looks like everything's there in the testfile itself (I ran the 
video using
  RealPlayer - all the controls worked for me).  What happens if I don't 
have RealPlayer -
what software would I need to correctly run the video?

Review Criteria - all the submitted files follow the naming convention and 
directory
structure
Review Result - not sure?
Comments - In terms of naming convention, the metadata file name
"sc1.2.5_l3_001.xml" seems to comply (except for "l3"? - see general comment
#5 above). Also, what happens if sc1.2.5 becomes something new as a result
of a new version of WCAG (see general comment #2 above)? The actual testfile
"sc1.2.5_l3_001.html" has "html" as the file type, but the "primary"
technology is xhtml judging from the doctype? What should the relationship
be between file type and "primary" technology (and what does "primary"
mean)?
In terms of directory structure, the metadata file seems to comply (the
"xhtml/metadata" part), but are subdirectories allowed in this structure?
The actual test file has "html" listed as a file type, which seems OK, but
what does this imply in terms of "primary technology" (the doctype is
"xhtml1/strict")? The "xhtml/testfiles" part of the path seems OK, but then
there is a subdirectory "resources/video..", which seems inconsistent with
the listing under "directory structure"? Also, under "testfiles" in the
process document there are "resource" subdirectories, but after "testfiles"
in this case there are the actual files - is this inconsistent (should there
be a "resource" part before the actual files for consistency)?

Review Criteria - all the files include valid markup unless otherwise 
required by the test
Review Result - pass?
Comments - The metadata file validates as "well formed XML (1 warning)"
according to the W3C validator, but not all the files validate according to
the W3C validator (may need to document the exceptions?). The actual
testfile validates as XHTML 1.0 Strict (according to the W3C validator).

Review Criteria - all the files include correct links unless otherwise 
required by the test
Review Result - cannot tell?
Comments - Do we need to check all the links? I checked a few in the
metadata file, and they seemed OK, but all the schema links need to be
checked for correctness. What is the definition of a "correct link"?
The video in the testfile seemed to play OK for me (the Windows Media link
worked), and I got the sound OK; all the controls worked for me, and the
ViSiCast link worked as well.
Also, should a "test purpose" somehow be included in the html file? There
was no sign language that I could see in the video, but someone watching
may forget what the metadata file says.

Review Criteria - all the files include correct spelling unless otherwise 
required by the test
Review Result - pass?
Comments - I could not find any spelling errors, but again, what is the
definition of "correct spelling"? Which dictionary is being used?


Metadata

Review Criteria - all the dates and other integer or literal values have 
the correct format
Review Result - cannot tell?
Comments - what is the definition of "correct format"?

Review Criteria - all static values (especially copyright notices) are 
included and accurate
Review Result - pass
Comments - no comment - they seem to be included in the metadata file

Review Criteria - all titles, descriptions, and other required fields are 
included and accurate
Review Result - cannot tell
Comments - What is the definition of "accurate"? There is a "title" tag in
the metadata file which accurately says that "the video contains no sign
language interpretation", and in the actual testfile there is a "title" tag
that just says "sign language interpretation", which seems inconsistent
(with the metadata title tag) and inaccurate to me, since it implies that
the accompanying video will have sign language interpretation.


Review Criteria - all identifiers (especially ID for techniques and rules) 
are used correctly
Review Result - not sure?
Comments - Aside from the id="sc1.2.5_l3_001" in the "xmlns:btw", I could
not find any id attributes in the metadata file (and there is no technique
tag, as is required). Does this test map to a technique?
No id attributes were found in the html testfile. Also, why is the expected
result "cannot tell" when the "purpose" tag above says "test case intended
to fail.."?
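
The missing technique tag suggests a simple "required elements present"
check would be worthwhile. A sketch - the list of required element names
below is only my reading of the process/metadata documents, not
authoritative:

    import xml.etree.ElementTree as ET

    REQUIRED = ["title", "description", "purpose", "technique", "rule", "file"]

    def missing_required(path):
        present = {elem.tag.split("}")[-1] for elem in ET.parse(path).iter()}
        return [name for name in REQUIRED if name not in present]

    print(missing_required("sc1.2.5_l3_001.xml"))  # would list "technique" for this sample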

Review Criteria - all structures such as rules, techniques, or pointers are 
used correctly
Review Result - cannot tell?
Comments - What is the definition of "used correctly"? See the previous
comment on the technique structure. Other pointers seem OK in the metadata
file, but perhaps I missed something? In light of the recent telecon
discussion on the primary rule, perhaps this part will change? The doctype
and namespace "pointers" in the html testfile are "xhtml". What happens if
my environment doesn't support "wmv"? I compared the metadata in the
metadata file against the metadata in the process document. There are two
"technical specs" sections with no test elements subsections?
SIDE NOTE: I'm seeing "R" and "TM" marks in all these documents - is this
consistent with licensing and public availability issues? The "complexity"
attribute on the "testcase" element - is that still important in light of
the recent telecon discussion on that attribute (it says "atomic" but there
are two technical specs)? In the process document, it says that the
required "file" element MUST contain one of four "optional" choices - none
of the options is included in this example, and the wording in the metadata
document seems confusing (what happens to MUST?). There is no "technique"
tag, as is required. Also, the reference in the rule tag is to "may", not
"december". There is also an inconsistency between "expected result" and
"purpose"?
Everything else seems OK.







-------------------------------------------------------------------------------

Structure Review for Test Sample sc1.2.7_l3_001


Contact Information

Review Criteria - name and email address of the submitter are available
Review Result - fail?
Comment - I could not find them in the metadata file?

Review Criteria - organization on whose behalf the test sample was submitted
Review Result - pass
Comment - I found it in the metadata file (ERTWG?)

Test Files

Review Criteria - all the files that are necessary to execute the test 
procedure
have been submitted
Review Result - pass
Comment - looks like everything's there in the testfile itself?

Review Criteria - all the submitted files follow the naming convention and 
directory
structure
Review Result - not sure?
Comments - In terms of naming convention, the metadata file name
"sc1.2.7_l3_001.xml" seems to comply (except for "l3"? - see general comment
#5 above). Also, what happens if sc1.2.7 becomes something new as a result
of a new version of WCAG (see general comment #2 above)? The actual testfile
"sc1.2.7_l3_001.html" has "html" as the file type, but the "primary"
technology is xhtml judging from the doctype? What should the relationship
be between file type and "primary" technology (and what does "primary"
mean)?
In terms of directory structure, the metadata file seems to comply (the
"xhtml/metadata" part), but are subdirectories allowed in this structure?
The actual test file has "html" listed as a file type, which seems OK, but
what does this imply in terms of "primary technology" (the doctype is
"xhtml1/strict")? The "xhtml/testfiles" part of the path seems OK, but then
there is a subdirectory "resources/video..", which seems inconsistent with
the listing under "directory structure"? Also, under "testfiles" in the
process document there are "resource" subdirectories, but after "testfiles"
in this case there are the actual files - is this inconsistent (should there
be a "resource" part before the actual files for consistency)? What is the
naming convention for the href="sc1.2.7_l3_001_01.html" (this html file and
the referencing html file - may be confusing)?

Review Criteria - all the files include valid markup unless otherwise 
required by the test
Review Result - pass?
Comments - The metadata file validates as "well formed XML" according to the
W3C validator. The actual testfile validates as XHTML 1.0 Strict (according
to the W3C validator).

Review Criteria - all the files include correct links unless otherwise 
required by the test
Review Result - cannot tell?
Comments - Do we need to check all the links? I checked a few in the
metadata file, and they seemed OK, but all the schema links need to be
checked for correctness. What is the definition of a "correct link"?
The "video(windows media 04.mb)" link seemed to work OK (does this imply
that I need Windows Media installed for the link to work correctly?). Also,
the "full multimedia.." link immediately following worked OK. Should this
link be presented before the actual video rather than after, so that
someone will know in advance that the alternative is available? Also, if
someone cannot see the video, how will they know what video content they
actually missed (to run the test)?
The video in the testfile seemed to play OK for me, and I got the sound OK
(I used RealPlayer - what would happen if I used other software?).
Also, should a "test purpose" somehow be included in the html file? There
was a separate alternative link after the movie, but someone accessing the
test file may forget what the metadata file says.

Review Criteria - all the files include correct spelling unless otherwise 
required by the test
Review Result - pass?
Comments - I could not find any spelling errors, but again, what is the
definition of "correct spelling"? Which dictionary is being used?


Metadata

Review Criteria - all the dates and other integer or literal values have 
the correct format
Review Result - cannot tell?
Comments - what is the definition of "correct format"?

Review Criteria - all static values (especially copyright notices) are 
included and accurate
Review Result - pass
Comments - no comment - they seem to be included in the metadata file

Review Criteria - all titles, descriptions, and other required fields are 
included and accurate
Review Result - cannot tell
Comments - What is the definition of "accurate"? There is a "title" tag in
the metadata file which accurately says "video with auditory information in
text alternative", and in the actual testfile there is a "title" tag that
says "video with auditory information in text alternative", which seems
consistent and accurate according to the actual video.

Review Criteria - all identifiers (especially ID for techniques and rules) 
are used correctly
Review Result - not sure?
Comments - Aside from the id="sc1.2.7_l3_001" in the "xmlns:btw", there were
no other id attributes that I found in the metadata file. Also, the "may"
reference after the "rule" element is no longer correct, because there is
now a "december" reference. No techniques are referenced, which is a
problem? No id attributes were found in the html testfile. Also, why is the
expected result "cannot tell" when the "purpose" tag above says "document
(test case?) is intended to fail.."? Minor nit - can a document fail, or
does a test case on that document fail? Also, the language in the
"dc-description" and "purpose" tags seems stilted and confusing (mixing
"auditory" and "video", as well as "content" and "information"?).

Review Criteria - all structures such as rules, techniques, or pointers are 
used correctly
Review Result - cannot tell?
Comments - What is the definition of "used correctly"?
Other pointers seem OK in the metadata file, but perhaps I missed something?
In light of the recent telecon discussion on the primary rule, perhaps this
part will change? The pointers in the html testfile are "xhtml" -
consistent/confusing? What happens if my environment doesn't support "wmv"?
I compared the metadata in the metadata file against the metadata in the
process document. The "complexity" attribute on the "testcase" element - is
that still important in light of the recent telecon discussion on that
attribute? In the process document, it says that the required "file"
element MUST contain one of four "optional" choices - none of the options
is included in this example, and the wording in the metadata document seems
confusing (what happens to MUST?). Also, there is no "techniques" tag or
"technique" tag, which are "required" according to the metadata document?
Everything else seems OK.
