TestGL Issues List

Num Date Class Status Raised By Owner
001 Jan 04 Substantive Unassigned QAWG TBD
Section: Introduction
Title: Need introductory material
Description: Need introductory material with structure similar to other Framework docs

Proposal: Class Of Product is test materials (in later discussion we also realized that we must include test metadata, else we will not be able to address any of the "process issues"). Audience is (in priority order) test developers, WG members, and users of tests.

DD and PC hoped there would be no need to define test metadata as a separate COP; they would rather define 'test materials' broadly enough to cover these metadata. On the other hand, including in the term "test materials" a test plan and the results of executing that plan to test those test materials (as in checkpoint 5.4) seems to stretch the definition too far (not to mention introducing a confusing element of recursion).

Resolution:  
002 Jan 04 Substantive Unassigned QAWG TBD
Section: Concepts - Types of testing
Title: Clarify and expand types of testing
Description: "Functional testing" is not a good term - should be renamed (to "other"?). Goal is to cover performance, usability, implementation-dependant features that are not covered by the spec (which is conformance testing)
Proposal:
Resolution:  
003 Jan 04 Substantive Unassigned QAWG TBD
Section: Concepts - Types of testing
Title: Expand discussion of interoperability testing and compare with conformance testing.
Description: Interoperability is possible without conformance, but it needs prior agreement, and it doesn't scale with the number of implementations or over time.
Proposal: No - don't lose focus. We've already said that we are addressing conformance test materials. Let's not confuse the issue by introducing interop testing concepts. Issue 004 talks about applying the conf. testing principles to other areas. Enough said.
Resolution:  
004 Jan 04 Substantive Unassigned QAWG TBD
Section: Concepts - Types of testing
Title: Emphasize usefulness of these guidelines for types of testing other than Conformance
Description: Note that many forms of testing can be performed by comparing actual behaviour with behaviour defined in a "specification". Conformance testing is only one such kind. Consequently, even though we will focus on conformance testing, much of what we say is applicable to other kinds of testing.
Proposal:
Resolution:  
005 Jan 04 Substantive Unassigned QAWG TBD
Section: Concepts - Types of testing
Title: Focus of the guidelines
Description: Should we focus on conformance testing or also cover other kinds of testing? (After all, the group's name is "QA" not "Conformance".)
Proposal: State that we will focus on conformance testing (for this revision of the doc).
Resolution:  
006 Jan 04 Substantive Unassigned QAWG TBD
Section: Concepts
Title: Don't mandate a 'waterfall model' of development
Description: Discuss alternative test-development strategies. Some examples:
  • help the user detect if the implementation is conformant or not
  • help the implementation developer improve his product
  • help the specification writer improve his specification

[MS] I don't see where these have anything to do with whether or not a "waterfall model" was used. [PC response] the first bullet in the list below does (test first, develop spec later)...

[PC] need a better definition of strategies for development. For example:

  • to clarify whether a proposed feature in spec is implementable
  • to verify whether implementations of spec are conformant
  • to verify whether implementations of spec are interoperable

We must stress that we don't necessarily require a "waterfall" process, and that the requirements described in the GL can be applied on a recursive basis; we should actually encourage this type of strategy (without requiring it) in this section.

[MS] I'd like to add the use case that good QA (and this guideline) will result in good test materials that will improve QA and thus result in fewer bugs, lower maintenance costs, etc. In other words, this use case illustrates voluntary use of tests as a carrot, rather than the stick as in certification, to improve software and save money.

[PC] This seems more like a general rationale for following these guidelines rather than a use-case. We should certainly incorporate this somewhere.

Proposal: We need a short, concise way of defining the "waterfall model". Once we've done that, all we need to say is that this guideline does not imply any one type of model. (We also need to ensure that we do not imply a waterfall model in the wording of guidelines and checkpoints.)

References:

  • http://asd-www.larc.nasa.gov/barkstrom/public/The_Standard_Waterfall_Model_For_Systems_Development.htm
  • http://www.ctg.albany.edu/publications/reports/survey_of_sysdev?chapter=5
  • http://www.convergsoft.com/contents/methodology_sdlc.htm
Resolution:  
007 Jan 04 Substantive Unassigned QAWG TBD
Section: Use Cases
Title: Need use-cases
Description: Suggested use-cases:
  • testing lab/certification authority needs to test products; the spec and the products already exist, only needs to check if the products conform
  • implementations are begun before spec is finished; tests are needed to check if they conform
  • WG needs feedback on its specification, and uses test cases as a way to get this feedback
  • WG uses test cases as a way to explore new features (e.g. in OWL, SVG, CSS)
  • comparisons of the actual state of implementations independently of conformance [example of interop testing]
Proposal:
Resolution:  
008 Jan 04 Substantive Unassigned QAWG TBD
Section: Guideline 2 - Test Assertions
Title: Test Assertions
Description: Despite overlap with SpecGL, we agreed that it's important to discuss assertions in TestGL. In SpecGL, TAs are considered output, while in TestGL, they are input.

Reviewing the TA definition in SpecGL, it appears we also want to move some of the verbiage of SpecGL GL10 to the definition in the SpecGL glossary and the QA-Glossary.

The assertions section of TestGL should point to the definition, address why assertions are important and how they can be extracted/derived (automatically or not), and address what makes a good/useful TA. It needs to emphasize again that there might be a feedback loop between test-assertion extraction and spec development.

Checkpoint 2.1: Rewrite checkpoint to: Provide a list of test assertions. Change to priority 1. Add some verbiage to the discussion to include examples of derived assertions.

Checkpoint 2.2. The discussion for this checkpoint should include more detail including the possibility of extracting more than one assertion from the same quoted text. It is up to the test developer to decide the mechanism for assigning a unique assertion-ID. The metadata in this checkpoint should be tied with the metadata in checkpoint 3.2.

Replace location for the text from which the assertion was derived.

[PC] we've discussed assertions several times, and don't seem to be able to agree on whether a specific assertion list is required, or whether it's sufficient to define the "test purpose" in metadata (explaining that assertions are a really good way to do this). Will re-open this as an email discussion, referencing relevant SpecGL Issues.

Proposal:
Resolution:  
009 Jan 04 Substantive Unassigned QAWG TBD
Section: Introduction - Class of Product
Title: Specify Class of Product
Description: We must specify the types of test materials to which these guidelines apply.

Proposal: Point to the class of product listed in SpecGl rather than duplicate the list here.

[MS] points out that the COP for SpecGL is "specification", in particular W3C Technical Reports.

[PC] We must be referring to Section 2.2 of SpecGL ("Specification category and class of product"), where we do classify different types of spec. (Should that section header really include the term "class of product"?)

Resolution:  
010 Jan 04 Substantive Unassigned QAWG TBD
Section: Definitions
Title: Need a definition of "testable".
Description: [PC] We had an extensive email thread on this. Did we ever resolve?

Proposal: [MS] and [PC] believe that it's unnecessary to formally define this term. The intuitive understanding most readers will have should suffice.

Resolution:  
011 Jan 04 Substantive Unassigned QAWG TBD
Section: Checkpoint 1.1 - Define the test suite scope
Title: Move content from rationale to discussion
Description: Keep the rationale concise. Move examples and other material to the discussion section. Remove the example cited in the rationale and add the HTTP (client only, server only) example provided by Alex Rousskov.

[PC] Alex will need to document this - we have no minutes.

Proposal:
Resolution:  
012 Jan 04 Editorial Unassigned QAWG TBD
Section: Checkpoint 1.2 - Identify the specifications to be tested
Title: Use the term "explicitly tests other specifications" in the rationale.
Description:
Proposal:
Resolution:  
013 Jan 04 Substantive Unassigned QAWG TBD
Section: Checkpoint 1.3 - Define a testing approach
Title: Define the term "testing approach"
Description: Define the term "testing approach" (refers to what kind of test is going to be developed) and provide examples (Validators, API testing, Protocol). Also, include partitioning as part of the discussion section.
Proposal:
Resolution:  
014 Jan 04 Editorial Closed QAWG Patrick Curran
Section: Checkpoint 1.3 - Define a testing approach
Title: Re-word conformance requirement
Description: Rewrite the conformance requirement to: "A testing approach must be identified as a result of the specification analysis".
Proposal: Close. [PC and MS] believe that the proposed wording unduly constrains the testing approach.
Resolution: CLOSED - no action
015 Jan 04 Substantive Unassigned QAWG TBD
Section: Guideline 3 - Define the process for managing test materials
Title: Guideline 3 addresses WG process rather than test materials
Description: Guidelines must address test materials, not WG processes.
Proposal: Change to "Support test material metadata".
Resolution:  
016 Jan 04 Substantive Closed QAWG Patrick Curran
Section: Checkpoint 3.1 - Define the process for managing test materials
Title: Drop Checkpoint 3.1 (it addresses processes)
Description: See Issue 15
Proposal: Drop this checkpoint
Resolution: DONE
017 Jan 04 Substantive Unassigned QAWG TBD
Section: Checkpoint 3.2 - Define the metadata to be associated with test materials
Title: Clarify Checkpoint 3.2
Description: Include a statement clarifying that "test materials" does not refer to testcases.

[PC] I don't understand this statement. Metadata is associated with testcases.

Proposal:
Resolution:  
018 Jan 04 Substantive Unassigned QAWG TBD
Section: Checkpoint 3.2 - Define the metadata to be associated with test materials
Title: Move some of the requirements and discussions back to Checkpoint 2.2 - Tag assertions with essential metadata
Description:

In the conformance requirement, the third bullet should go back to assertions. The fifth bullet is too ambiguous; it must be clarified.

In the discussion section:

  • The third bullet should go back to assertions as optional metadata
  • The fourth bullet should also go back to assertions, but as mandatory
  • The fifth bullet should go back to assertions, with "conformance level" reworded to "degree of conformance"
  • A new bullet should be added to include conditional tests as optional metadata.

[PC] see Issue #008

[MS] The distinction between tagging assertions with metadata and associating metadata with test materials is confusing to me. Since assertions ultimately result in test materials, why do we need the metadata in two places? Wouldn't it be simpler (and less confusing) if we just associated the metadata with either the assertions or the test materials, but not both?

[PC] There may be many test-cases associated with a single (complex) assertion. Under these circumstances, attributes like "optional", or "specific to version x.y of the spec" belong with the assertion (so that they aren't duplicated). On the other hand, attributes like "input arguments/data for test" and "expected results of test" clearly must be associated with tests. Continue to discuss this in the context of our ongoing discussion of assertions.
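
To illustrate the split PC describes, the sketch below (Python, with hypothetical field names not drawn from TestGL) keeps assertion-level attributes in one record and test-level attributes in another, linked by assertion ID so that several test cases can share one assertion without duplicating its metadata:

# Illustrative sketch only: the field names are hypothetical, not taken from TestGL.
from dataclasses import dataclass

@dataclass
class AssertionMetadata:
    """Attributes that belong with the assertion, so they are not duplicated per test."""
    assertion_id: str        # unique ID assigned by the test developer
    source_text: str         # quoted spec text from which the assertion was derived
    spec_version: str        # e.g. "specific to version x.y of the spec"
    optional_feature: bool   # whether the asserted behaviour is optional

@dataclass
class TestCaseMetadata:
    """Attributes that belong with an individual test case."""
    test_id: str
    assertion_id: str        # link back to the (possibly shared) assertion
    inputs: dict             # input arguments/data for the test
    expected_results: str    # expected results of the test

# Several test cases may be associated with a single (complex) assertion.
a1 = AssertionMetadata("A-001", "Quoted spec text...", "x.y", optional_feature=False)
tests = [
    TestCaseMetadata("T-001a", a1.assertion_id, {"arg": 1}, "expected result 1"),
    TestCaseMetadata("T-001b", a1.assertion_id, {"arg": 2}, "expected result 2"),
]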

Proposal:

Resolution:  
019 Jan 04 Substantive Unassigned QAWG TBD
Section: Checkpoint 3.3 - Provide coverage information
Title: Modify conformance requirement of Checkpoint 3.3
Description: Modify the conformance requirement to require publishing the list as well as the percentage.

[PC] The assertion list? This is covered elsewhere.

[MS] This requirement should also be modified so that the "at a minimum . . ." clause uses "must" rather than "should". Since this is a minimum list, it's an absolute requirement, thus "must".
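
For concreteness, the published figure could be computed from the assertion list and the set of assertions exercised by at least one test; the sketch below is purely illustrative and uses invented names:

# Hypothetical sketch: compute the coverage percentage from the full assertion list
# and the set of assertion IDs exercised by at least one test.
def coverage_percentage(all_assertion_ids, tested_assertion_ids):
    covered = set(all_assertion_ids) & set(tested_assertion_ids)
    return 100.0 * len(covered) / len(set(all_assertion_ids))

# Publish both the list and the percentage:
assertions = ["A-001", "A-002", "A-003", "A-004"]
tested = ["A-001", "A-003"]
print(sorted(set(assertions) - set(tested)))                       # untested assertions
print(f"{coverage_percentage(assertions, tested):.0f}% covered")   # "50% covered"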

Proposal:
Resolution:  
020 Jan 04 Substantive Closed QAWG Patrick Curran
Section: Checkpoint 3.4 - Provide an issue-tracking system
Title: Drop Checkpoint 3.4

Description: This addresses process - inappropriate for TestGL.

[PC] have we captured this "good practice" somewhere in OpsGL?

Proposal:
Resolution: DONE
021 Jan 04 Substantive Unassigned QAWG TBD
Section: Checkpoint 3.5 - Automate the test materials management process
Title: Checkpoint 3.5 inappropriately addresses process
Description: Once again, we're talking about process (belongs in OpsGL, not here). Reword to focus on metadata and on mechanisms for filtering, sorting, manipulating it.
Proposal:
Resolution:  
022 Jan 04 Editorial Unassigned QAWG TBD
Section: Checkpoint 4.1 - Define the test execution process
Title: Suggested change for conformance requirement of Checkpoint 4.1
Description: Rewrite conformance requirement to: "The process for executing tests must be well defined and must document how to execute the test".

[PC] how is this better than what we currently have ("The process for executing tests must be well defined and documented")? Note also that "well defined" is untestable. The essence of this checkpoint is: "tell me how to execute the tests. Do so unambiguously, so that if someone else follows your instructions they will get the same results as I do". (See Issue #023.)

Proposal:
Resolution:  
023 Jan 04 Substantive Unassigned QAWG TBD
Section: Guideline 4 - Define the process for executing tests
Title: Add a checkpoint requiring that test results be reproducible and repeatable
Description: In discussion we realized that the only testable requirement would be to require specification of whether test (results) are reproducible and repeatable. This is pretty weak, but what else is testable?

Alternate suggestion (preferred after discussion): materials must document where test results are not expected to be reproducible and repeatable, and explain why. If this info is specific to a particular test (as opposed to a group of tests or even the entire test suite) it should be contained in the test metadata.

Include in the rationale a discussion of the order in which tests must be run (if indeed order is important). We must also ensure that every test that should be run is run, and that those that should be excluded from the test run are excluded.
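
As an illustration of the preferred alternative, per-test metadata could carry explicit reproducibility/repeatability flags with an explanation, and a run manifest could fix which tests run and in what order; the Python sketch below uses invented names and is not TestGL wording:

# Illustrative only: the field and class names are assumptions, not TestGL wording.
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class TestRepeatability:
    test_id: str
    reproducible: bool = True
    repeatable: bool = True
    explanation: Optional[str] = None   # should be filled in whenever either flag is False

@dataclass
class RunManifest:
    """Fixes which tests run, and in what order, so that a run can be reproduced."""
    ordered_test_ids: List[str]     # every test that should be run, in order
    excluded_test_ids: List[str]    # tests deliberately excluded from the run

t = TestRepeatability("T-042", repeatable=False,
                      explanation="Depends on wall-clock timeouts in the implementation under test.")
manifest = RunManifest(ordered_test_ids=["T-001", "T-002", "T-042"],
                       excluded_test_ids=["T-099"])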

Proposal:
Resolution:  
024 Jan 04 Substantive Unassigned QAWG TBD
Section: QAWG Glossary (and/or Definitions section of this doc?)
Title: Define "reproducible" and "repeatable"

Description: Meeting minutes said define in "glossary". Also in definitions section?

Proposal: Definitions from [LR]:

The real definitions are from ISO 5725-1:1994/Technical Corrigendum 1, Published 1998-02-15, "Accuracy (trueness and precision) of measurement methods and results Part 1: General principles and definitions" and ISO 5725-2:1994, "Accuracy (trueness and precision) of measurement methods and results Part 2: Basic method for the determination of repeatability and reproducibility of a standard measurement method."

"3.13 repeatability: Precision under repeatability conditions.

3.14 repeatability conditions: Conditions where independent test results are obtained with the same method on identical test items in the same laboratory by the same operator using the same equipment within short intervals of time.

3.17 reproducibility: Precision under reproducibility conditions.

3.18 reproducibility conditions: Conditions where test results are obtained with the same method on identical test items in different laboratories with different operators using different equipment."

Resolution:  
025 Jan 04 Substantive Closed QAWG TBD
Section: Guideline 4 - Define the process for executing tests
Title: Add a checkpoint requiring that test results be reproducible and repeatable
Description: Rolled into issue #023
Proposal:
Resolution:  
026 Jan 04 Substantive Unassigned QAWG TBD
Section: Checkpoint 4.2 - Automate the test execution process
Title: Do not require platform-independence of the test execution framework
Description: Checkpoint 4.2 ("Automate the test execution process"). Essential requirements are: 1) test execution must be automated, 2) automation must be platform-independent. It is not always necessary or possible to provide a cross-platform framework. Platform-independence should be removed from the requirements and instead stated as a goal in the discussion.

[PC] what's the difference between "platform-independent" and "cross-platform"? (Are we really trying to say "don't write to a particular platform, but don't feel obliged to write for all platforms"?)

Proposal:
Resolution:  
027 Jan 04 Substantive Unassigned QAWG TBD
Section: Checkpoint 4.3 - Integrate results reporting into the automated test execution process
Title: Do not mandate use of any supplied test harness
Description: Results reporting is the aggregation of the results. The more you expect the harness to do, the more issues you have with implementers' ability to use the harness, since they may want to use their own for their own reasons. So, require that there be a test harness, but allow people to substitute their own reporting mechanism or harness. If there are instructions for using the test suite (with the test harness), indicate that the harness is not required for making a conformance claim. Important issue, should be captured, but out of scope for this CP. Put in the generic discussion for Guideline 4, keeping it general, since this has broader applicability.

[PC] I'm not sure what the last two sentences mean.
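
One way to read the proposal is that the suite ships a harness but keeps the reporting boundary narrow enough that implementers can substitute their own; the sketch below is a hypothetical illustration of such a boundary, not anything defined by TestGL:

# Hypothetical sketch of a narrow reporting interface: the supplied harness depends
# only on this interface, so implementers can substitute their own reporter or harness.
from typing import Protocol

class ResultsReporter(Protocol):
    def record(self, test_id: str, outcome: str, message: str = "") -> None: ...
    def aggregate(self) -> dict: ...

class SimpleReporter:
    """Default reporter bundled with the suite; trivially replaceable."""
    def __init__(self):
        self.results = {}
    def record(self, test_id, outcome, message=""):
        self.results[test_id] = (outcome, message)
    def aggregate(self):
        counts = {}
        for outcome, _ in self.results.values():
            counts[outcome] = counts.get(outcome, 0) + 1
        return counts

def run_suite(tests, reporter):
    # The harness aggregates results only through the reporter interface.
    for test_id, test_fn in tests:
        try:
            test_fn()
            reporter.record(test_id, "pass")
        except AssertionError as e:
            reporter.record(test_id, "fail", str(e))

reporter = SimpleReporter()
run_suite([("T-001", lambda: None)], reporter)
print(reporter.aggregate())   # {'pass': 1}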

Proposal:
Resolution:  
028 Jan 04 Editorial Unassigned Dimitris Dimitriadis TBD
Section: Checkpoint 5.1 - Review the test materials
Title: Suggested wording for description of Checkpoint 5.1

Description: Ideally, this means that the WG (or whatever body will in the end endorse the use of the test suite) approves the tests in the test suite as well as the test generation mechanisms (if applicable).

[PC] Once again, we're straying into OpsGL territory by addressing process. Can we sidestep this by requiring that test metadata include the results of review? [MS] agrees.

[DD] Agreed, and I propose to reword to "make tests reviewable" (this implies coming up with a scheme for reviewing, which is not process, but technical detail).

[PC] Still sounds like process to me. We must focus on the output (results of review) and not on how that output is produced (the "scheme for reviewing").

Proposal:
Resolution:  
029 Jan 04 Editorial Closed Dimitris Dimitriadis TBD
Section: Checkpoint 5.1 - Review the test materials
Title: Suggested wording for description of Checkpoint 5.1
Description: Rolled into issue #028
Proposal:
Resolution:  
030 Jan 04 Substantive Unassigned Dimitris Dimitriadis TBD
Section: Checkpoint 5.3 - Package the test materials into a test suite
Title: Clarify components of test suite

Description: Need to spell out the parts that make up a test suite, as well as what parts are optional. Also need definitions of the terms used (for example "test harness", "test case").

[MS] This sounds like process to me. I’m not sure it should be a ckpt.

[DD] I think this is as much process as providing test materials to begin with, that is, not much. "Packaging" is perhaps a poorly chosen word, but it boils down to someone having to delimit relevant tests (used to test conformance) from irrelevant ones.

[PC] It's easy to reword the checkpoint to focus on the output (the package) rather than the process (the packaging). The essence of the checkpoint is that a website containing a bunch of links, some of which point to tests, others to docs, others to 'metadata', does not constitute a test suite. Until everything is packaged together, test execution runs are unlikely to be reproducible and repeatable. (See also new issue 049 below.)

Proposal:
Resolution:  
031 Jan 04 Substantive Unassigned Dimitris Dimitriadis TBD
Section: Checkpoint 5.4 - Test the test suite
Title: Suggested description for Checkpoint 5.4 - Test the test suite

Description: This needs to be stressed either as the final proof that this is the real thing (being endorsed by the WG or similar body) or alternatively as one more step in making sure the test suite is designed to be of as high quality as possible (without making reference to it being officially approved).

[PC] Again, this checkpoint addresses process (the WG's activities) rather than the test materials. Require that a test plan (and/or test results) be published along with the test suite?

[MS] Also, as this ckpt now stands, it cannot be tested. (How ironic for a ckpt to test the test suite). The ckpt should require a test plan and test results, as PC suggests.

[DD] Again, this is Ops territory since it requires something from the WG, but it is essential in order to be able to use the test materials for conformance. Include it in the workflow of providing test materials for it to be testable (as a checkpoint). I stress the need for it to be clearly delimited as it is THE activity (I think) that distinguishes "officially accepted" test materials from non-accepted ones.

Proposal:
Resolution:  
032 Jan 04 Substantive Unassigned Dimitris Dimitriadis TBD
Section: Checkpoint 5.4 - Test the test suite
Title: Should have separate checkpoints for test materials and test suite

Description: I still think we should have two separate checkpoints for test materials and the test suite, respectively, as these are two separate things and quite different from one another (for example, tests are related to specifications, test suites are not).

[PC] Two separate checkpoints for doing what? Testing the test materials and the test suite?

Proposal:
Resolution:  
033 Jan 04 Substantive Unassigned QAWG TBD
Section: Checkpoint 5.5 - Solicit feedback on the test materials
Title: Describe how feedback on test materials could be provided

Description: Describe the possible ways in which feedback is given: a mailing list, for example, requires readers who can act on it. Emphasize that feedback should be acted upon.

[PC] Another process-related checkpoint. Relatively easy to reword as a requirement to define and publish a feedback mechanism.

[MS] The important point is not to describe ways feedback can be given, but to obtain feedback, somehow, some way. The feedback mechanism is not important, just the results. Can we just require that feedback solicitation and feedback results be published? [DD] agrees

Proposal:
Resolution:  
034 Jan 04 Substantive Unassigned Dimitris Dimitriadis TBD
Section: Guideline 6 - Define the process for reporting test results
Title: Request clarification of Guideline 6

Description: Are we here speaking of the automated result reporting in the test suite, or of something separate? Since we indicate automation in the second paragraph of the introduction, we may want to specify that.

[MS] Again, this is a process requirement. Does it belong here?

[PC] Shouldn't have used the "process" word. The blurb under the guideline and the discussion text for checkpoints 6.1 and 6.2 make it clear that we're talking about the mechanisms that the tests use to report their results to the person executing them. If this isn't done in a clear, consistent, and understandable manner it will not be possible to accurately determine which tests have passed and which tests have failed, once again defeating the goal that test runs be reproducible and repeatable.

Proposal:
Resolution:  
035 Jan 04 Editorial Unassigned Dimitris Dimitriadis TBD
Section: Checkpoint 6.2 - Tests should report diagnostic information
Title: Suggested wording change for Checkpoint 6.2
Description: Provide diagnostic information where applicable (some implementations may not, for example, implement error reporting).
Proposal:
Resolution:  
036 Jan 04 Substantive Unassigned Dimitris Dimitriadis TBD
Section: Checkpoint 6.3 - Define and document the process for reporting test results
Title: Comment on Checkpoint 6.3

Description: Again, if automated, this relates more to the technical issues about the framework than to the process as such (we want to allow for different processes to lead to the same result, namely uniform results reporting).

[PC] whether or not we have automation we must still define and document a process for reporting results.

[MS] Again, this is process.

[PC] We can get around this in the same way as with 'review the test results' or 'gather feedback'. Focus on the output and not the process.

[DD, commenting earlier, but really addressing this point] Propose to change the wording to "in the absence of existing mechanisms for reporting test results, create one and package it together with the test suite"

Proposal:
Resolution:  
037 Jan 04 Substantive Unassigned Dimitris Dimitriadis TBD
Section: Checkpoint 6.4 - Allow test results to be filtered
Title: Clarify wording of test results filtering in Checkpoint 6.4
Description: Perhaps we should change the wording to indicate that filtering be done in accordance with specification modules, for example. I agree that filtering is more relevant in the building or execution phases than after the tests have been run.
Proposal:
Resolution:  
038 Jan 04 Substantive Unassigned Dimitris Dimitriadis TBD
Section: Checkpoint 6.5 - Automate the results reporting system
Title: Suggested consolidation of discussion of results reporting
Description: If we stress automation of results reporting, this should be discussed in one checkpoint (relates to issues #034 and #036 above).
Proposal:
Resolution:  
039 Jan 04 Substantive Unassigned Dimitris Dimitriadis TBD
Section: Appendices
Title: Need explicit appendices
Description: Need to be made explicit.
Proposal:
Resolution:  
040 Jan 04 Editorial Unassigned QAWG TBD
Section: Checkpoint 5.1 - Review the test materials
Title: Wording changes for Checkpoint 5.1
Description: Move the first sentence of the ConfReq to the discussion. The object of this CP is the test material management system. Both modules (objects) are required for conformance. If the system has been in use for several years, that use would constitute a valid review. Review all test materials. Add 'all' in all applicable places; we need to review the use of 'all' in the entire document.
Proposal:
Resolution:  
041 Jan 04 Substantive Unassigned QAWG TBD
Section: Checkpoint 5.2 - Document the test materials
Title: Checkpoint 5.2 is redundant

Description:

[PC] Why?

Proposal:
Resolution:  
042 Jan 04 Substantive Unassigned QAWG TBD
Section: Checkpoint 5.3- Package the test materials into a test suite
Title: Checkpoint 5.3 is not testable

Description: A test suite is all the pieces of material needed, wrapped up together. As written, the checkpoint is not testable. Make a minimal list of what MUST be provided, including: user documentation, IPR, test harness if supplied, referenced output if defined. Need to make sure that ‘test suite’ is understood. A test suite is the package (sum) of all the components needed to test an implementation. Test materials are the components that make up the test suite. (These terms should be defined.)

Proposal:
Resolution:  
043 Jan 04 Substantive Unassigned QAWG TBD
Section: Checkpoint 5.4 - Test the test suite
Title: Suggested clarification of objects of TestGL

Description: Management system is the object. David Marston to help define the 2 objects of TestGL. In discussion, include the frequency of applying the test plan and testing.

[PC] ???

Proposal:
Resolution:  
044 Jan 04 Substantive Unassigned QAWG TBD
Section: Checkpoint 6.1 - Tests should report their status in a consistent manner
Title: Clarify meanings of test-result states

Description: Need to add definitions for the terms. Reword. Remove "Cannot Tell". In the discussion, reference that the terms came from EARL. Can you map these to other states, or must you use these states? If these states apply, then they MUST be used. These are states; we have definitions of the states; the definitions are normative. We are not providing labels for the states: if a state applies, use it. Recommend that, if the test materials are in English, these labels be used. Change "status" to "outcome".

[MS] This ckpt, and others in Guideline 6 use “should” in describing the ckpt. The “should” should (or must - I’m going insane) be changed to “must”. Checkpoints should not leave any wiggle room. A better way to describe the ckpt is with an active verb (e.g., Report test status, Report diagnostic info.)
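
As a sketch of what normatively defined outcome states might look like, the illustration below uses only the states mentioned in this issue; the exact set and labels (including whether "Cannot Tell" survives) are unresolved here, and the code is an assumption, not drawn from EARL or TestGL:

# Illustrative only: the outcome set reflects the states mentioned in this issue.
from enum import Enum

class TestOutcome(Enum):
    PASS = "pass"
    FAIL = "fail"
    CANNOT_TELL = "cannot tell"   # issue 044 proposes removing this state

def report(test_id: str, outcome: TestOutcome) -> str:
    # Tests report their outcome using only the defined states, in a consistent form.
    return f"{test_id}: {outcome.value}"

print(report("T-007", TestOutcome.PASS))   # "T-007: pass"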

Proposal:
Resolution:  
045 Jan 04 Editorial Unassigned QAWG TBD
Section: Checkpoint 6.2 - Tests should report diagnostic information
Title: Simplify conformance requirement for checkpoint 6.2
Description: Simplify to "must provide diagnostic information". The remainder of the sentence is rationale.
Proposal:
Resolution:  
046 Jan 04 Editorial Unassigned QAWG TBD
Section: Checkpoint 6.3 - Define and document the process for reporting test results
Title: Checkpoint 6.3 already exists in OpsGL
Description: Rewrite as "Define an interface to allow publishing of results".
Proposal:
Resolution:  
047 Jan 04 Substantive Unassigned QAWG TBD
Section: Checkpoint 6.4 - Allow test results to be filtered
Title: Reword checkpoint 6.4
Description: "Have a results management system"
Proposal:
Resolution:  
048 Jan 04 Substantive Unassigned QAWG TBD
Section: Checkpoint 6.5 - Automate the results reporting system
Title: Reword checkpoint 6.5

Description: "automate the system"

[PC] ???

Proposal:
Resolution:  
049 Feb 5, 2004 Substantive Unassigned Patrick Curran TBD
Section: Guideline 3
Title: Need a new checkpoint: releases of test suites must be versioned

Description: Test suites must be explicitly released (rather than "dribbled out" by constantly updating a website), and must be versioned. If they're not, test runs cannot be deterministic or repeatable and the results of different test runs cannot be compared.

Proposal:
Resolution:  

Table Legend

Num
Issue number
Title
Short title/name of the issue
Description
Short description of issue, possibly including link to origin of issue
Date
The date at which the issue was raised or initially logged.
Class
Substantive or Editorial
Status
One of: Unassigned, Active, Closed, Postponed
Section
Section of the document this issue applies to
Proposal
Current proposal for resolution of issue, possibly including link to further text
Resolution
Short description of resolution, possibly including link to a more elaborate description
Raised by
Person who raised the issue
Owner
QA WG Member responsible for the issue