Re: New Possible TestGL outline

I, too, think this new organization is great!  Many thanks.

On Thursday, we discussed references that seemed to be for a "test strategy 
document" and discussed whether TestGL should mandate such a document 
and/or a "test design document".  It sounds like, in the new organization, 
G1 could serve as the requirements for a "test strategy document" and G2 
(and maybe some parts of G3) as requirements for a "test design" 
document.  I think that "determine level (depth/breadth) and priority of 
coverage" sound more like test strategy and should move from G2 up to G1, 
perhaps even be the very first checkpoint.

Also, do we want to require (a) separate document(s) addressing test 
strategy and/or test design?  If not, where are the documentary 
requirements captured?

Mark





  At 01:53 PM 3/9/2003 -0500, Lynne Rosenthal wrote:

>Excellent.  The new organization and structure is clear, to the point, and 
>addresses all my concerns.  Many thanks to Peter and Patrick.
>
>lynne
>
>
>At 10:31 AM 3/9/2003 -0500, pfawcett wrote:
>
>>Howdy folks,
>>
>>This is a brief breakdown of the new structure/outline for TestGL that
>>Patrick and I came up with after Friday's meeting. We both think that this
>>has a significantly better flow to it, addressing some of Lynne's major
>>general concerns. We also tried to steer away from 'loaded language' like
>>methodology, test framework, and a few other terms we found in the document
>>that caused difficulty because they meant different things to different
>>folks. We also tried to condense things down a bit. At the beginning we
>>thought we would be getting rid of a guideline, but instead we ended up
>>creating a new one. The checkpoints, however, are down to a reasonable 2-4
>>per guideline rather than up to 14 in some.
>>We also made an effort not to duplicate work (and thus potentially get out
>>of sync) with other documents. Primarily this concerns the interaction of
>>TestGL and OpsGL. OpsGL has a number of checkpoints concerning how test
>>materials are contributed, what criteria must be followed, and so on, so we
>>made an effort not to re-include that here. Many of these do not have
>>priorities assigned yet either.
>>Finally, we tried to steer clear of "this is the right way to write a test
>>suite" and instead focused on "this is what a good test suite should be
>>composed of". Lynne's new checkpoint for Issue 107 (which has just been
>>posted) would fit nicely into the new G1 or G2. If nothing else, this can
>>serve as a basis for discussion on Monday: is this a better organization
>>than before, and are we missing anything?
>>
>>Thanks,
>>Peter
>>
>>Outline of new guidelines:
>>
>>G1 - high-level functional analysis of the spec to determine the strategy
>>of test development. (was G2-G3)
>>         - combine 2.1 and 2.2 into one checkpoint:
>>                 analyze the specification and determine how to structure
>>                 test materials.
>>                 determine what testing areas the specification is
>>                 composed of.
>>         - 3.1 determine how to cover each area. Is only one approach
>>                 going to be used, or will there be more than one?
>>         - 1.10 develop user scenarios for the specification.
>>         (move 3.2 to ExTec or some such, or keep it as descriptive text.)
>>
>>G2 - deep analysis of the spec to extract what needs to be tested and how
>>(was G1)
>>         - extract assertions/normative language and tag them
>>                 according to category, using the categories provided by
>>                 Patrick. In other words, rather than having explicit
>>                 checkpoints for each of the required, optional, deprecated,
>>                 discretionary, under-defined, etc. asserts, have one
>>                 checkpoint that has all asserts or normative language
>>                 extracted and then grouped by category. It's the same basic
>>                 idea, but it folds 4 checkpoints into one.
>>         - determine level of coverage (depth/breadth) and priority of
>>                 coverage (what's most important).
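>>
>>                 Purely for illustration (the categories are the ones listed
>>                 above; the assertion ids and the format are placeholders,
>>                 nothing we have agreed on), the extracted-and-tagged
>>                 assertions could be grouped along these lines:
>>
>>                     # hypothetical grouping of extracted assertions by
>>                     # category; ids and categories are placeholders only
>>                     assertions_by_category = {
>>                         "required":      ["sec-3.2-a", "sec-3.2-b"],
>>                         "optional":      ["sec-4.1-c"],
>>                         "deprecated":    ["sec-5.5-d"],
>>                         "discretionary": ["sec-6.3-e"],
>>                         "under-defined": ["sec-7.2-f"],
>>                     }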
>>
>>G3 - test management system (was part of G4)
>>         - have a test management system
>>         - the system must support metadata such as documentation,
>>                 pass/fail criteria, coverage level, state of the test,
>>                 association back to asserts, and dimensions of variability.
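>>
>>                 Purely as an illustration of the kind of metadata meant
>>                 here (every field name below is a placeholder, not a
>>                 TestGL term), a per-test record might look roughly like
>>                 this Python sketch:
>>
>>                     # hypothetical per-test metadata record; all field
>>                     # names and values are placeholders for discussion
>>                     test_metadata = {
>>                         "id": "ts-0042",
>>                         "documentation": "checks a required attribute",
>>                         "pass_criteria": "output matches reference file",
>>                         "coverage_level": "basic",
>>                         "status": "accepted",         # state of the test
>>                         "assertions": ["sec-3.2-a"],  # link back to asserts
>>                         "dimensions_of_variability": ["profile", "module"],
>>                     }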
>>
>>G4 - test suite/cases development (was G6)
>>         - prototype test framework (6.2)
>>         (OpsGL deals with submission and the criteria for acceptable
>>         submissions)
>>
>>G5 - test execution. (was part of G4)
>>         - the metadata from the management system must provide sufficient
>>                 information to allow tests to be executed in a predictable
>>                 and consistent way.
>>         - automation is encouraged (cross-platform stuff goes to ExTec.)
>>         - the system must allow for tests to be filtered based on metadata
>>                 criteria. (This is where the DOV really enters in for a
>>                 test suite, rather than in the analysis part: in the
>>                 analysis you want to identify them, but here is where you
>>                 really care.)
>>         - the test execution process should save output from tests for
>>                 analysis (pass/fail, which cases failed, and logs if
>>                 relevant).
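>>
>>                 As a rough sketch of the filtering idea (illustrative only;
>>                 the metadata fields reuse the hypothetical record above and
>>                 are not TestGL terms):
>>
>>                     # a tiny suite of two metadata records, for illustration
>>                     suite = [
>>                         {"id": "ts-0042", "status": "accepted",
>>                          "coverage_level": "basic"},
>>                         {"id": "ts-0043", "status": "draft",
>>                          "coverage_level": "basic"},
>>                     ]
>>
>>                     # keep the tests whose metadata matches every criterion
>>                     def select_tests(all_tests, **criteria):
>>                         return [t for t in all_tests
>>                                 if all(t.get(k) == v
>>                                        for k, v in criteria.items())]
>>
>>                     # e.g. run only the accepted tests at "basic" coverage
>>                     to_run = select_tests(suite, status="accepted",
>>                                           coverage_level="basic")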
>>
>>G6 - result reporting (was G5)
>>         - must support result reporting (5.1)
>>         - it should create reports as a unified package (for example, as a
>>                 web page (5.3))
>>         - it must indicate what passed and what failed.
>>         - it should be automated if possible
>>         - it should support filtering and comparison of results
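>>
>>                 A minimal sketch of that reporting (again illustrative
>>                 only; the result format here is an assumption): each run
>>                 maps a test id to "pass" or "fail", the summary counts the
>>                 outcomes, and the comparison lists tests whose outcome
>>                 changed between two runs.
>>
>>                     # count passes/failures in one run, listing the failures
>>                     def summarize(results):
>>                         failed = [t for t, r in results.items()
>>                                   if r == "fail"]
>>                         return {"passed": len(results) - len(failed),
>>                                 "failed": len(failed),
>>                                 "failures": failed}
>>
>>                     # tests whose outcome changed between two runs
>>                     def compare(old, new):
>>                         return {t: (old.get(t), new.get(t))
>>                                 for t in set(old) | set(new)
>>                                 if old.get(t) != new.get(t)}
>>
>>                     old_run = {"ts-0042": "pass", "ts-0043": "fail"}
>>                     new_run = {"ts-0042": "pass", "ts-0043": "pass"}
>>                     print(summarize(new_run))
>>                     print(compare(old_run, new_run))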
>>
>>G7 - conformance testing. (was G7)
>>         - encourage vendors to use the test suite. (p1)
>>         - encourage vendors to publish results. (p3)
>

Received on Sunday, 9 March 2003 16:28:58 UTC