
Re: Outline for TestGL-Lite

From: Lynne Rosenthal <lynne.rosenthal@nist.gov>
Date: Wed, 03 Mar 2004 11:33:41 -0500
Message-Id: <5.1.0.14.2.20040303112328.02dbe7e8@mailserver.nist.gov>
To: Patrick Curran <Patrick.Curran@Sun.COM>, QAWG <www-qa-wg@w3.org>

I'm not sure I agree with Principle #2: that "implementation developers need 
to understand where their implementation is deficient and what they can do 
to fix it... To the extent that this is possible, tests should report what 
went wrong (what they were expecting, and what happened), as an aid to 
debugging the problem".

For conformance tests, this is nice to have but not necessary to determine 
conformance.  In the purest sense, a conformance test does not provide 
information beyond the pass/fail indication and what we are calling test 
metadata (description or purpose, test requirement, and/or traceability back 
to the spec).  In fact, many test suites (both within and outside W3C) do 
only this much.

I think providing extra information about what went wrong is helpful, but it 
is neither necessary for conformance tests nor always possible (as you 
indicate).  So, I would categorize this as a good practice (i.e., recommended).
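
To illustrate the distinction, here is a minimal sketch (hypothetical names, 
not taken from any actual test suite) contrasting a check that reports only a 
pass/fail verdict with one that also reports the expected and actual values 
as a debugging aid:

    // A minimal sketch (hypothetical names) contrasting the two reporting
    // styles: a bare pass/fail verdict versus a verdict that also carries
    // the expected and actual values as a debugging aid.
    public class ReportingStyles {

        // Pure conformance style: the result is just the verdict, plus
        // whatever test metadata accompanies it.
        static String conformanceCheck(int actual, int expected) {
            return (actual == expected) ? "PASS" : "FAIL";
        }

        // "Good practice" style: on failure, also say what was expected
        // and what actually happened, to help implementers find the defect.
        static String diagnosticCheck(int actual, int expected) {
            if (actual == expected) {
                return "PASS";
            }
            return "FAIL (expected " + expected + ", got " + actual + ")";
        }

        public static void main(String[] args) {
            System.out.println(conformanceCheck(3, 4)); // prints: FAIL
            System.out.println(diagnosticCheck(3, 4));  // prints: FAIL (expected 4, got 3)
        }
    }

Both versions determine conformance equally well; only the second helps with 
debugging, which is why I'd file it under recommended rather than required.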


>
>Principle #2: Users of the test suite must understand how to interpret the 
>test results
>
>Implementation developers need to understand where their implementation is 
>deficient, and what they can do to fix it.
>
>This implies:
>    * Tests should report their results in a meaningful and consistent manner
>    * To the extent that this is possible, tests should report what went 
> wrong (what they were expecting, and what happened), as an aid to 
> debugging the problem

regards
Lynne
