RE: draft review of TSDTF test samples assigned to me - my action item


Some additional comments below:

At 20:02 4/02/2008, Carlos Iglesias wrote:

>Hi Tim,
>Some inline comments about the first review, but 
>I think they're applicable to all of them:
> >
> > (1) Need a way to handle versioning of WCAG drafts and techniques
> >   in the metadata -  the techniques links in the metadata are dated
> > from the May WCAG draft, even though the December draft has been published
> > and the
> > WCAG techniques links have been updated?

All references to WCAG documents are references 
to dated versions, in other words, not to WCAG or 
the Techniques doc generally but to the version that was published on date X.
If the references did not point to dated 
versions, it would require some detective work to 
find out on which version a test sample (and its 
test/fail statement) is based, before one could 
check the accuracy of the metadata. With the 
current approach (dated versions) a statement in 
the metadata remains correct (and can be checked 
after following the links), even though the 
relevance of the statement decreases when newer versions of WCAG are published.
Since WCAG has changed considerably since the 
start of this task force's activity, I think this was a good approach.

(Even if the references worked with generic 
instead of dated links, it would still be 
necessary to check the test samples after each new draft of WCAG 2.0.)

> >
> > (2) Need a way to give the title of the SC, not just the number, in the
> > metadata, because the number may change between successive WCAG drafts due
> > to
> > addition/removal of SCs between versions, so the referenced number may
> > become
> > incorrect.  I know, because I recently updated the WCAG1-WCAG2 mapping,
> > and
> > found that older SC numbers had changed, or were deleted, from earlier
> > WCAG
> > versions
>I would also add that we need an updating 
>plan for keeping the metadata consistent each time 
>WCAG is updated (also related to the next point).

I admit it would be convenient to have the titles 
of the success criteria available, but I would 
prefer to have them in the HTML view instead of 
the XML version of the metadata. In other words, 
I would prefer a mechanism to import those data 
in the transformation process from TCDL XML to 
HTML. Putting the titles in the XML files would cost us considerably more time.

The issue of using SC numbers vs SC IDs will come 
up in the discussion on the naming convention. See the summary at

> > (3) there needs to be some mention of what software is needed to correctly
> > play
> > the files in the testfiles.  One of the criteria is that the testfiles
> > "work correctly",
> > but that is partially dependent upon the users' environment and installed
> > software?
> > What is the test environment for these tests?
>That's a good point.

technicalSpec/@baseline can be used for this.
See <>.
This provides a way of defining which formats are 
assumed to be supported on the client's side, rather than specifying software.
(For software, TCDL has a separate mechanism, 
but the Task Force rejected the use of that part of TCDL.)
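
As a rough illustration, a baseline declaration in the metadata could look like this (the technicalSpec/@baseline name is from TCDL as referenced above, but the value syntax and the self-closing form shown here are my assumptions, not the actual schema):

```xml
<!-- Illustrative sketch only: @baseline is the TCDL attribute mentioned
     above; this token syntax and element shape are assumed, not taken
     from the TCDL schema. -->
<technicalSpec baseline="XHTML1.0 CSS2 WMV"/>
```

The point is that it declares the formats a client is assumed to support, so a reviewer whose environment lacks one of them knows the sample cannot be evaluated there.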

> > (4) In the structure review following, does the "test files" portion refer
> > just to the
> > actual test file, or to both the metadata file and the test file?  The
> > distinction can be
> > confusing - for example, there are links in both the metadata file and the
> > test file, so
> > the checklist for "links working correctly" can be applied to both
> > files?  Similarly, in
> > the "metadata" portion, the checklists for "titles being accurate" can be
> > applied to both
> > metadata file and test files?    Should the labelling of the structure
> > review match progression
> > through first the metadata file and then the associated test file (if not
> > already done)?
>My understanding is that the "test files" 
>section is intended to refer only to the actual 
>test files (not metadata) and the "metadata" 
>section is intended to refer only to the metadata file.

Yes, "test files" refers only to the actual test files.
I propose that we add to the metadata section 
that links in the XML metadata also need to be checked.

> > (5) The naming convention for these files is "l1", etc., for levels of the
> > WCAG SCs, but WCAG SCs
> > now use levels A, AA, AAA, so this may be confusing?
>Don't think so, they are just file names and I 
>think that xxxx_lAA_xxxx is not going to be much more meaningful.

l1 used to refer to Level 1 but now refers to Level A, etcetera.
Should we make this explicit in our metadata document?

>Additionally I think l1, l2 and l3 is also valid, 
>as we have 3 levels, whatever they are named. As 
>a final point, I don't think this is the time to change the naming convention.
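
If we do make the convention explicit in the metadata document, it could be documented along these lines (the level mapping is from my comment above; reading the final segment as a sequence number is my assumption):

```
sc1.2.1_l1_001
├─ sc1.2.1  the Success Criterion being tested
├─ l1       the conformance level: l1 = Level A (formerly Level 1),
│           l2 = Level AA, l3 = Level AAA
└─ 001      presumably a sequence number for this SC/level combination
```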
> > ---------------------------------------
> >
> >
> > ----------------------------------------
> >
> > Structure Review for Test Sample sc1.2.1_l1_001
> >
> > Contact Information
> >
> > Review Criteria - name and email address of the submitter are available
> > Review Result - fail?
> > Comment - I could not find it in the metadata file?
>They are at the Test Sample Status List
>We may need to clarify this in the structure review process.

That sounds like a good idea.

>Is there a need to have them in the metadata file?

I believe we rejected this in earlier 
discussions, but I don't have a reference.

> > Review Criteria - organization on whose behalf the test sample was
> > submitted
> > Review Result - pass
> > Comment - I found it in the metadata file (ERTWG?)
>This information is also in the Test Sample Status List.
>Until now the submitter has always been BTW, not ERTWG.
> >[...]
> >   the actual testfile
> > "sc1.2.1_l1_001.html" contains "html" as the file type (but the "primary"
> > technology is xhtml from the
> > doctype?) - what should the relationship be between file type and "primary
> > (what does that mean)"
> > technology??
> > In terms of directory structure, the metadata file seems to comply (the
> > "xhtml/metadata" part)
> > but are subdirectories allowed in this structure?  The actual test file
> > has
> > "html" listed as a file type
> > this seems OK, but what does this imply in term of "primary technology"
> > (doctype is "xhtml1/strict")?
>Not sure what you mean by "primary 
>technology". I don't remember using this terminology anywhere.

Same here. The technologies section in the metadata lists "XHTML", not "HTML".

> >    The "xhtml/testfiles" part of the path seems OK, but
> > then there is a subdirectory "resources/video.." which seems inconsistent
> > with the listing under
> > "directory structure"?  Also, under "testfiles" in the process document,
> > there are "resource" subdirectories,
> > but after "testfiles" in this case there are the actual files - is this
> > inconsistent (should there
> > be a "resource" part before the actual file for consistency)?
>I think that the only inconsistency here is that 
>the "video" subdirectory is directly under the 
>"testfiles" one, and the rest of the resources 
>(applets, audio, flash...) are under the "resources" directory.
>According to the "Directory Structure" section 
>of the "Using TCDL" document, it is the rest of the 
>resources (applets, audio, flash...), and not video, that are in the wrong place.

Something strange happened here.
I *thought* we had agreed to use the structure 
"testfiles/resources/video" (same as in BenToWeb) 
instead of "testfiles/video", but the metadata 
document describes the latter structure.
Can somebody go back and check what we decided and when?
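
To spell out the two structures in question (the trailing file names are placeholders):

```
testfiles/resources/video/...   structure I thought we had agreed on (as in BenToWeb)
testfiles/video/...             structure the metadata document currently describes
```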

> > Review Criteria - all the files include valid markup unless otherwise
> > required by the test
> > Review Result - fail?
> > Comments - metadata file validates as "well formed XML (1 warning)"
> > according to the W3C validator; not all the files validate according to the
> > W3C validator (may
> > need to document the exceptions?).  The actual testfile fails validation
> > with 4 errors (according
> > to the W3C validator).
>IMO it's difficult here to say whether the validation 
>errors are required by the test or not, as the test is about captions.

It's not a matter of validation errors being 
*required* but of explaining why validation 
errors are present. As a result of a previous 
review, I got an action item 
to document the validation errors.
So I added the following sentence to the purpose: 
"The test case is only about the availability of 
captions, not about valid code (object contains 
an embed element as a fallback)."

When a reviewer finds validation errors, the 
purpose needs to be checked for an explanation.

If this is insufficient, please comment.

> > Review Criteria - all the files include correct links unless otherwise
> > required by the test
> > Review Result - cannot tell?
> > Comments - need to check all the links?  Checked a few in the metadata
> > file, and they seemed OK,
> > but need to check all the schema links for correctness..  What is the
> > definition of a "correct link"?
>I think we need to check all of them (although 
>several may be done in an automatic way)
>My interpretation of correct link is one that is 
>not broken (unless it's required by the test 
>purpose) and get to what is expected at first.

This item is about the actual test files; in 
these test files all links (including CSS and JavaScript) need to be checked.

> > Video in testfile seemed to play OK for me, and I got the sound OK ..
> > but some of the buttons were "grayed out" (last four)
> > - also should there be a "test purpose" somehow included in the html file
> > -
> > there were no captions
> > in the video file for me but someone watching may forget what the metadata
> > file says.

The test file was intended to fail the SC because of the missing captions.
The new HTML view should make the review easier.

>The metadata is supposed to be part of the 
>test, so you shouldn't ignore it. Anyway, the 
>plan is also to expose some of the metadata.
> > Review Criteria - all the files include correct spelling unless otherwise
> > required by the test
> > Review Result - pass?
> > Comments - could not find any spelling errors, but again, what is
> > definition of "correct spelling"?
> > Which dictionary is being used?
>I wouldn't complicate things too much regarding 
>spelling. I vote for just an "obvious misspelling" criterion.

Right. But do we allow both American and British 
English? (And Canadian English?)
Some test files, especially for SC 3.1.1 and SC 
3.1.2, use languages other than English (but those haven't been submitted yet).

> > Metadata
> >
> > Review Criteria - all the dates and other integer or literal values have
> > the correct format
> > Review Result - cannot tell?
> > Comments - what is the definition of "correct format"?
>We have discussed this before. It seems obvious 
>that we should include this information directly in the documentation.

Those values were available in earlier versions 
of the metadata document but were removed for 
reasons of usability. The example file at 
<> can be used instead.

> > Review Criteria - all titles, descriptions, and other required fields are
> > included and accurate
> > Review Result - cannot tell
> > Comments - what is definition of "accurate"?  There is a "title" tag in
> > the
> > metadata file, which
> > accurately says "a video with no captions", but in the actual testfile
> > there is a "title" tag that
> > says "prerecorded multimedia with captions", which does not seem accurate
> > and is a contradiction
> > with the title tag in the metadata (I did not notice any captions in the
> > video when played)...
>I also noted that there's a certain degree of 
>inconsistency between the metadata titles and the 
>test sample titles, and I agree that in some 
>cases this inconsistency may be relevant.
>Should consistency between the two titles be required?

There is no requirement that the metadata titles 
and the test file titles be identical.
In BenToWeb, where we did end-user testing, my 
position was that the test file titles mustn't be 
suggestive, in order to avoid influencing the 
users' feedback, but in the task force there is no such restriction.
However, with regard to metadata titles, my 
position is that they must be descriptive instead 
of "prescriptive", i.e. I don't like titles like 
"All data tables should have a summary attribute" 
(that doesn't say anything about the table in the 
test sample: does it have a summary attribute or not?).

> >   the only id attribute I found in the metadata file was for the technique
> > tag, which
> > contained "F8" (I think there should be a technique description as well -
> > see my earlier comments -
> > rather than just "F8", because if techniques change "F8" may no longer be
> > correct, and it may be
> > difficult to determine correctness in the future - document management
> > issue)...
> > Also the "may"
> > reference is no longer correct, because there is now a "december"
> > reference.. No id attributes
> > found in html testfile..  Same goes for the "F8" reference after
> > "location"
> > tag under "rule" element previous - there
> > is also a "may" reference there.. the "F8" here has no path to give it
> > context.

Each metadata file has an ID attribute on the 
document element (testCaseDescription/@id); this 
needs to be consistent with the naming convention.
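
For example, a sketch of the document element for the sample under review (only testCaseDescription/@id and the file name come from the discussion above; namespace declarations and any other attributes are omitted):

```xml
<!-- Sketch: @id matches the file name per the naming convention;
     namespaces and other attributes omitted. -->
<testCaseDescription id="sc1.2.1_l1_001">
  ...
</testCaseDescription>
```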

>That's true, but I think our main issue here is 
>being "condemned" to work with such unstable 
>references. I think that it's really useful and 
>makes sense to have solid SC and Techniques 
>references; is there anything the WCAG WG can do on the subject?

See my first comment in this message: solid in 
what regard? Dated versions are "solid" (stable), 
but in a different way than generic references.

> >[...]
> > What happens
> > if my environment doesn't support "wmv"?
>Good point!

See my comment regarding technicalSpec/@baseline.

>Additionally, what about the copyright of the 
>videos, how can we be sure about it?

Once they are submitted to the W3C, W3C licenses apply.
This is one of the points that would need to be 
mentioned in the test case submission interface, 
which we discussed a long time ago.

>   I compared metadata in metadata
> > file against metadata in
> > process document..  Two "technical specs" sections - is the "testelement"
> > section still correct in
> > light of recent WCAG evolution?

In what sense would it be incorrect?

>It was there waiting for the "Baseline concept" 
>to evolve. Maybe it's time to take a final 
>decision on this (IMO it doesn't make sense any 
>more due to the evolution that the Baseline concept has experienced).
> >The "complexity" attribute on the
> > "testcase" element - is that
> > still important in light of recent telecon discussion on that
> > attribute?
>No, it shouldn't.

Complexity attributes will be removed: action 
item at <> ;-)

> > In the process document,
> > it says that required "file" element MUST contain one of four "optional"
> > choices (wording seems
> > confusing - even though none of the options is included in this example,
> > wording in metadata document
> > seems confusing - what happens to MUST?)
>What do you mean by "the process document"?
>I think that both the metadata document 
>[] and 
>the TCDL specification say that the file 
>element CAN contain one of four "optional" choices.

For a simple HTTP request,

<file xlink:href="../testfiles/sc1.2.1_l1_001.html" />

should become


Best regards,


>That's all as far as I am concerned.
>  CI.
>Carlos Iglesias
>Fundación CTIC
>Parque Científico-Tecnológico de Gijón
>33203 - Gijón, Asturias, España
>teléfono: +34 984291212
>fax: +34 984390612

Please don't invite me to LinkedIn, Facebook, 
Quechup or other "social networks". You may have 
agreed to their "privacy policy", but I haven't.

Christophe Strobbe
K.U.Leuven - Dept. of Electrical Engineering - SCD
Research Group on Document Architectures
Kasteelpark Arenberg 10 bus 2442
B-3001 Leuven-Heverlee
tel: +32 16 32 85 51 


Received on Tuesday, 19 February 2008 18:17:40 UTC