
Annotated minutes from TestGL discussion in Crete

From: Patrick Curran <Patrick.Curran@Sun.COM>
Date: Sun, 17 Aug 2003 17:42:34 -0700
To: QAWG <www-qa-wg@w3.org>
Message-id: <3F4020FA.5060607@sun.com>

GL 1 deals with analysis of the test suite.
CP1.1 Define test suite scope
Remove sentence: "Note, a WG may have multiple test suites for different ..."
 >> DONE

CP1.1 and CP1.3: Are these redundant?
No. CP1.1 refers to the overall scope/goal. CP1.3 refers to the test
methodology applied to the whole test suite as well as to components
(parts) of the test suite.

GL 2 Identify and tag testable assertions
CP2.1 Is it a requirement that TAs be developed? Yes. Need to reword the
conformance requirements to indicate that the TAs must come from the spec
if they are there, but they may be somewhere else. Make P1.

 >> DONE

CP2.2 Metadata must be associated with test assertions
The rationale is not really a rationale, but could be a title. Also, do we
want SHOULD or MUST?
Make P1. The current 2.2 title is the rationale. Make the title "Metadata
MUST be associated with TAs." Need to use active voice. Require a
minimum set of required metadata. The rest of the metadata could be either
a SHOULD in this CP, a P2 checkpoint, or in ExTech. Decided to have a P2
checkpoint with a second set of metadata. Will need to develop a schema
for this. Also, indicate that these sets of metadata are not exhaustive.

 >> DONE
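As a hypothetical illustration of the two-tier metadata idea discussed above
(the field names here are invented for illustration; the WG schema is still
to be developed), the required/optional split might look like:

```python
# Hypothetical sketch of a two-tier metadata scheme for test assertions.
# Field names are illustrative only, not from any agreed WG schema.

REQUIRED_METADATA = {"id", "spec_section", "assertion_text"}   # P1 minimum set
OPTIONAL_METADATA = {"title", "rationale", "version", "dov"}   # P2 second set

def validate_assertion(metadata: dict) -> list:
    """Return a list of problems with an assertion's metadata."""
    problems = []
    missing = REQUIRED_METADATA - metadata.keys()
    if missing:
        problems.append(f"missing required fields: {sorted(missing)}")
    unknown = metadata.keys() - REQUIRED_METADATA - OPTIONAL_METADATA
    if unknown:
        # The sets are explicitly not exhaustive, so extra fields
        # are only flagged, not rejected.
        problems.append(f"fields outside the known sets: {sorted(unknown)}")
    return problems
```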

GL3 Test case management systems
Note that we assume that the tests exist; we don't discuss how they appear.
We go from TAs to managing the test cases. Should there be something
regarding the creation of tests? OpsGL addresses some of this. Missing a
guideline on performing QA on test cases: do QA on your test materials.

 >> This is all covered in OpsGL (even QA - OpsGL includes the statement
 >> "the WG MUST define in its QA Process Document a procedure for reviewing
 >> test materials contributions, and the procedure MUST minimally address
 >> criteria for accuracy, scope, and clarity of the tests.")
 >> How to handle this overlap?

CP3.2 rephrase to be similar to CP2.2, i.e., the notion of having a minimum
set of metadata
CP3.4 Does this belong? This should reflect the storing of
results. Actually, it may be possible to remove it, since this is obvious -
all test cases have an expected result. Saying this is O.K. Should this
be a separate checkpoint? Yes. Reword: "You must have expected results
associated with each of your test cases." Make P1.

 >> DONE

Do we need a better definition of a test case management system? The
concepts (functions) here are fuzzy; there is no clear line between what is
part of the management system and what is part of the framework. The
functions can be done in either system. For example, selecting the
appropriate test cases can be done at build time (of the test suite) or at
run time (when executing the test suite). A concepts section would be
helpful.
 >> TBD - transferred to issues list

GL4 Provide a test framework (harness)
CP4.1 This should not be P1. The metadata and documentation enable the
development of a harness. A test harness is an (automated?) mechanism
that provides a consistent interface to testing. Harness refers to the
process of executing the tests.

 >> DONE

Missing checkpoints or a guideline on documentation. Documentation could
be part of the harness, but could also be part of other things, e.g.,
test cases

 >> TBD - transferred to issues list

CP4.2 Prototype the test framework
Prototype is the wrong word. A framework should not be built in
isolation. If you build a harness, then it should be beta-tested and tried
out. Maybe relate this to the classes of products to ensure that the
harness runs on a wide variety of platforms. The end of the rationale (run
on a wide number of platforms) should be part of the conformance requirements.

 >> DONE

CP4.3 Automation of testing encouraged
Is this different from a harness? Can paper be a test harness? A harness
provides a process (instructions) for executing the tests in a particular
sequence. Change.

Summary of GL4 discussion: Broadened into a test execution process with 2
checkpoints:
1. Document the process to execute tests.
2. If possible, it is desirable to automate. If automated, then 4.2 follows.

 >> DONE
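The two-checkpoint split above (document the execution process; automate
where possible) could be sketched as a minimal harness - a consistent
interface that runs tests in a defined sequence and collects outcomes. All
names below are illustrative, not part of any TestGL requirement:

```python
# Minimal sketch of a test harness. It executes tests in a documented
# sequence and records each outcome; the sequence itself is the
# "process (instructions) for executing the tests".

def run_suite(tests):
    """Execute each (name, callable) pair in order; return name -> outcome."""
    results = {}
    for name, test in tests:
        try:
            test()
            results[name] = "pass"
        except AssertionError:
            results[name] = "fail"
        except Exception:
            results[name] = "error"
    return results
```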

GL5 Test results
Merge CP5.1 and CP5.3. Record not just pass/fail, but other states as well.
There must be a well-defined mechanism for reporting; this mechanism does
not have to be automated, but automation is desirable. Make the checkpoints
analogous to Guideline 4.

 >> DONE
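One hypothetical way to model "not just pass/fail" result states (the state
names here are illustrative, not taken from the minutes or any WG schema):

```python
from enum import Enum

# Hypothetical set of result states beyond pass/fail.
class TestResult(Enum):
    PASS = "pass"
    FAIL = "fail"
    NOT_RUN = "not_run"          # e.g. test filtered out or skipped
    CANNOT_TELL = "cannot_tell"  # outcome could not be determined

def summarize(results):
    """Count occurrences of each state in an iterable of TestResult."""
    counts = {state: 0 for state in TestResult}
    for r in results:
        counts[r] += 1
    return counts
```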

GL6 Promote for conformance testing
Do we need this guideline, since some of this is already in OpsGL? In
section 1.4, Relationship to other Framework documents, make it clear where
this picks up from OpsGL. Add CP6.1 to OpsGL; CP6.2 is already in OpsGL.

Is it logical to divide into these 3 processes: test management system,
test framework, and test reporting?

 >> Probably. See comments elsewhere about a Concepts section.

Within Test Materials, there are 3 partitions: Test Cases, Test Software,
Test Documentation. Need to draft a description of these partitions and
provide a discussion of the licenses, with suggestions on how to apply the
licenses, including a mention that some things straddle the partitions.

 >> Licensing is covered in OpsGL. Another example of the overlap problem.

Review of Comments:
1. DM's comments apply to a previous TestGL draft:
DM-1.1 Agree

 >> DONE (removed reference to multiple test suites). Should we
 >> address modularization?

DM1.2 Need to make sure we cover versioning.

 >> Checkpoint 3.2 of OpsGL covers versioning. More overlap. Nevertheless,
 >> added version info to the list of metadata

Should metadata tie back to SpecGL DoV? We talk about filtering, but
should filtering on the DoV allowed by the specification be called out?
Yes. Want to include or exclude tests based on DoVs.

 >> This is addressed in the metadata section.
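Filtering tests on DoV metadata, as discussed above, might look like the
following sketch (the metadata keys and DoV values are hypothetical):

```python
# Hypothetical sketch: include or exclude test cases based on a
# dimensions-of-variability (DoV) value recorded in their metadata.

def filter_by_dov(test_cases, include=None, exclude=frozenset()):
    """Select test cases whose 'dov' metadata matches the filter.

    include: if given, keep only tests whose DoV value is in this set.
    exclude: always drop tests whose DoV value is in this set.
    """
    selected = []
    for case in test_cases:
        dov = case.get("dov")
        if dov in exclude:
            continue
        if include is not None and dov not in include:
            continue
        selected.append(case)
    return selected
```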

Need to include a test purpose or description.

 >> DONE

GL4: talking about test case review. Can say something about where to get
test data from, but this is an analysis guideline.

 >> Test case review is covered in OpsGL. More overlap

CP5.4: should there be mention of storage of results? Covered in test
results management.
We don't mention anything about coverage. Need to add something about
coverage.
 >> DONE

2. SM's comments
GL4: concepts have been clarified.
CP4.3 Agreed. Add a sentence to clarify.

 >> DONE

CP5.1 Need to clarify and define terms. Use some of the suggested words.

 >> DONE

3. PF's comments
1. Addressed
2. Agreed

 >> DONE

3. need to look at intro


4. Relates to coverage and strategy. Will say something about coverage.

 >> Coverage is now addressed, but the broader question of differing strategies
 >> at different points in the development cycle has not been addressed.
 >> Transferred this to issues list.

5, 6, 7. Agreed

 >> Ongoing review required.

Processing Plan.
WG draft by late September
Need ExTech document prior to first call
First Last Call: ??


Review of Introduction
Disagreement on whether a scope can be a set of requirements. Class of
products also includes interoperability. Intended audience: delete
conformance. Remove or move the last paragraph of 1.4. For sections 1.5,
1.6, and 1.7, make sure that the wording is consistent with the other
documents. Need to create use cases for TestGL.

Review of Conformance and Definitions
Definitions need work. Conformance Section needs to be reviewed.

 >> Not yet addressed. Transferred to issues list.
Received on Sunday, 17 August 2003 20:43:48 UTC