AI-20030317-8

Ok, here is my second action item that I had for today.
I have included the full text of each comment, followed by a "PF -"
comment which is my off-the-cuff response to each one. These comments
aren't ready to send back until we discuss them (hopefully on Monday),
but they're a good starting point. A number of the first comments relate
to the introduction, which was determined to need a re-write at the last
F2F in Boston. The comments that the XML Schema folks made were quite
consistent with Lynne's and Lofton's comments that we processed at the
F2F.

Other consistent comments related to the overlap between OpsGL and
TestGL, another issue that was discussed both by the WG and by Patrick
and myself when doing the new outline draft.
Take a look at my comments and see if you agree.
Thanks,
Peter


QA Framework: Test Guidelines
W3C Working Draft 20 December 2002
http://www.w3.org/TR/2002/WD-qaframe-test-20021220

Note TGL-LA1.
	Context
		Status of this document, 4th and 6th paragraphs: "A future version
	of this document will be accompanied by a "Specification Examples &
	Techniques" document"
	
	Comment
		Wrong document name: "Specification Examples & Techniques". This
	is also direct duplication with 4th paragraph: "This part of the
	Framework document family will eventually have an informative
	accompanying QA Framework: Test Examples and Techniques document"
	
	Proposal
		Drop 4th paragraph and fix 6th paragraph to use "Test Examples &
	Techniques"

PF - Yes, agree with the proposal; this is part of the Intro re-write.


Note TGL-JT1.
	Context
		General
	
	Comment
		The style of the language is very poor: the document has the
	appearance of having been written hurriedly and never reviewed
	(perhaps not what we would expect from the QA group!). In some cases,
	the language is so garbled as to render sections ambiguous. Recommend
	extensive editorial review and enhancement. In particular: Checkpoint
	3.1, final 2 paragraphs; Checkpoint 4.2; Checkpoint 4.3.
	
	Proposal
		Improve language.

PF - Yes, agree. This was discussed at the last F2F. Lofton and Lynne
both specifically mentioned some of the GL4 checkpoints.


Note TGL-LA2.
	Context
		1. Introduction
	
	Comment
		The Introduction doesn't follow the recommendations of SpecGL
	(checkpoints SpecGL 1.1, 1.2, 2.1) and doesn't describe the scope of
	the specification and the classes of product.
	
	Proposal
		Follow the structure of introduction used in SpecGL and OperGL and
	clearly describe the scope and class of products.

PF - Yes, agree. This was specifically discussed at the last F2F. The
Intro needs to be re-written to be in accordance with the other
documents.


Note TGL-JT2.
	Context
		1. Introduction
	
	Comment
		The terms "conformance" and "compliance" are interchanged in an
	apparently haphazard fashion. The term "compliance" occurs nowhere
	else in the Framework.
	
	Proposal
		Replace all occurrences of "compliance" with "conformance".

PF - Yes, agree. We do conformance testing, not compliance testing. This
was specifically mentioned at the last F2F.


Note TGL-JT3.
	Context
		1.1 Motivation for this guidelines document. 1st bullet.
	
	Comment
		First item in bulleted list: it is not realistic to assume that it
	is possible to eliminate the possibility of misinterpretation of a
	specification.
	
	Proposal
		Substitute "minimized" for "eliminated".

PF - Yes. This is likely true. 'Eliminated' is a nice goal but not 
realistic.


Note TGL-LA3.
	Context
		1.1 Motivation for this guidelines document. 2nd bullet.
	"Developing an open conformance test suite for a specification,
	applicable by any implementation. "
	
	Comment
		It's not clear what "open" means here. Availability and licensing
	policy are in the scope of the Operation Guidelines and should be
	addressed there.
	
	Proposal
		Drop the word "open" here and say something about "openness" in
	Operation Guidelines.

PF - Yes, there is an attempt not to repeat material from OpsGL. This
will either be reworded or removed in the next re-write of the
introduction.


Note TGL-LA4.
	Context
		1.1 Motivation for this guidelines document. 3rd bullet. "Ensuring
	the quality of the particular implementation by testing against
	specifications and conducting interoperability testing with other
	available implementations."
	
	Comment
		There are multiple issues in this sentence. First, the quality of an
	implementation is a much wider notion than just conformance. Along with
	conformance criteria it may also include performance, reliability,
	interoperability and other criteria. Also, interoperability is a
	different notion than conformance to the specification. For example,
	two implementations can be interoperable but non-conformant, and two
	conformant implementations can be non-interoperable.
	
	Proposal
		Change to: "Improving the quality of the particular implementation
	by testing against specifications and conducting interoperability
	testing with other available implementations."

PF - Yes, it was decided at the last F2F that this needed to be
re-written as part of the Intro re-write.


Note TGL-LA5.
	Context
		1.1 Motivation for this guidelines document. "A free-to-use
	conformance test suite that covers most if not all of the
	specification requirements, is developed by interested parties across
	industry, and is applicable to any of the specification's
	implementations, provides:"
	
	Comment
		Freedom to use and the parties involved in the development are in
	the scope of the Operation Guidelines. These are not really relevant
	to the subsequent logic.
	
	Proposal
		Change to: "A conformance test suite that covers most if not
	all of the specification requirements, and is applicable to any of the
	specification's implementations, provides:"

PF - Yes, it was decided at the last F2F that this needed to be
re-written as part of the Intro re-write, and to not duplicate or
overlap with OpsGL.


Note TGL-LA6.
	Context
		1.2 Navigating through this document
	
	Comment
		What Guideline covers test quality and especially portability?
	
	Proposal
		Add guidelines ensuring test quality and portability between
	implementations.

PF - This is a good question, and it raises one of the 'big' issues with
the proposed new GL4: what can we say about "test quality" that is
verifiable, meaningful, and doesn't restrict test suite developers?
If portability is important, we may not want to completely remove
all of the language about platform independence; instead, we may want
to come up with one general checkpoint about the importance of
portability and not being system dependent.


Note TGL-LA7.
	Context
		Checkpoint 1.1. Identify the target set of specifications that are
	being tested "[EX-TECH] For example, XML test suite [@@Link] may not
	include tests that specifically test the URN format, but XSLT [@@Link]
	and XQuery [@@Link] test suites will include many tests for XPath
	functions."
	
	Comment
		Is this URN or URI?
	
	Proposal
		Change to URI?

PF - Yes. I think that's right. Change to URI.


Note TGL-LA8.
	Context
		Checkpoint 1.5. Identify all the discretionary choices defined in
	the specification [Priority 1]
	
	Comment
		Is this really Priority 1? Other similar checkpoints have priority 2.
	
	Proposal
		Change priority to 2?

PF - This is now part of the test management system, as part of the
required metadata/information that needs to be stored in the system.
Rather than imposing a structure on the test suite, allow for the test
suite to be filtered by different criteria, including structure.
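
To illustrate the kind of filtering I mean (this is only a rough,
hypothetical sketch; the field names and the helper below are invented
for this email and are not taken from the TestGL draft or from any real
test management system), it could look something like this in Python:

    from dataclasses import dataclass, field

    # Hypothetical per-test metadata record; fields are illustrative only.
    @dataclass
    class TestCase:
        test_id: str
        spec_section: str        # part of the spec the test targets
        discretionary_item: str  # discretionary choice exercised, if any
        priority: int = 2
        keywords: list = field(default_factory=list)

    SUITE = [
        TestCase("t001", "3.2", "attribute-defaulting", 1, ["parsing"]),
        TestCase("t002", "4.1", "", 2, ["output"]),
        TestCase("t003", "3.2", "whitespace-handling", 1, ["parsing"]),
    ]

    def filter_suite(suite, **criteria):
        # Keep only the tests whose metadata match every given criterion.
        return [t for t in suite
                if all(getattr(t, name) == value
                       for name, value in criteria.items())]

    # e.g. view the suite "by spec section" instead of by directory layout
    print([t.test_id for t in filter_suite(SUITE, spec_section="3.2")])
    # e.g. pull out only the Priority 1 tests
    print([t.test_id for t in filter_suite(SUITE, priority=1)])

The point is that no one structure is imposed on the suite itself; any
of the stored metadata can serve as a view onto it.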


Note TGL-JT4.
	Context
		Section 1.5.
	
	Comment
		Final definition, "Results Verification": presumably the intention
	is to determine whether an implementation passes or fails a test, not
	"if a test passes or fails". See also Checkpoint 4.11.
	
	Proposal
		Clarification is needed.

PF - Yes, the wording needs work. This will be fixed with the re-write of
the glossary.


Note TGL-LA9.
	Context
		Checkpoint 2.2. Provide mapping between the test suite structure
	and the specification structure. [Priority 1]
	
	Comment
		All test-coverage-related items are very important and should have
	the same priority.
	
	Proposal
		Change priority to 2?

PF - This is now part of the test management system, as part of the
required metadata/information that needs to be stored in the system.
Rather than imposing a structure on the test suite, allow for the test
suite to be filtered by different criteria, including structure.


Note TGL-LA10.
	Context
		Checkpoint 3.2. Identify publicly available testing techniques.
	List the publicly available testing techniques that have been reused
	[Priority 1]
	
	Comment
		Sharing testing techniques is important for test development
	performance but is probably not P1.
	
	Proposal
		Change priority to 2 or 3?

PF - This is not a checkpoint anymore; it is now related to ExTech or
descriptive language. These are good things to do as one is developing a
test framework or test management system, but they are not really
testable. It's more along the lines of best practices.


Note TGL-LA11.
	Context
		Checkpoint 4.1. List available test frameworks and applicable
	automation. Identify available test frameworks used. If none, justify
	why new frameworks are needed and existing ones could not be used.
	[Priority 1]
	
	Comment
		This is again related only to the performance and efficiency of test
	development, so it can be of lower priority. Besides, researching all
	frameworks available in the world could take a lot of unnecessary
	effort, and adaptation efforts should be taken into account. Also, it
	would be good to choose a test framework that could be used with
	different test execution automation systems.
	
	Proposal
		Change priority to 2 or 3? Limit frameworks to those already used
	in W3C?

PF - This is not a checkpoint anymore; it is now related to ExTech or
descriptive language. These are good things to do as one is developing a
test framework or test management system, but they are not really
testable. It's more along the lines of best practices.


Note TGL-LA12.
	Context
		Checkpoint 4.2. Ensure the framework and automation are platform
	independent. Demonstrate on 3 platforms. Ensure that the framework and
	automation are built using open standards. [Priority 1]
	
	Comment
		These are maybe two separate checkpoints. The second one, about
	openness, is not very clear and needs clarification. What does "open
	standard" mean? For example, is Perl an open standard? Python? Tcl?
	
	Proposal
		Either drop or clarify the second sentence.

PF - This is currently moving to either ExTech or descriptive language
rather than remaining an explicit checkpoint.


Note TGL-JT5.
	Context
		Checkpoint 4.4.
	
	Comment
		It is not reasonable a priori to claim that a TS will "eventually
	cover all areas of the specification".
	
	Proposal
		Remove this sentence.

PF - True. This will be re-worded in a new CP in the Test Management Guideline.


Note TGL-LA13.
	Context
		Checkpoint 4.5. Ensure the ease of use for the test automation.
	Document how the test automation is used. [Priority 1]
	
	Comment
		Of course, some minimum documentation is required. High-quality
	documentation, however, requires significant resources and may be of
	lower priority. If unqualified, it's not clear how to check "ease of
	use".
	
	Proposal
		Split into several checkpoints of different priorities?

PF - Ease of use is a hard concept to define or validate. The new focus
is on ensuring that the test framework produces consistent, predictable
results regardless of who is running the tests.


Note TGL-LA14.
	Context
		Checkpoint 4.6. Ensure the framework allows for specification
	versioning and errata levels. Explain how specification versioning and
	errata levels are accommodated by the test framework [Priority 2]
	
	Comment
		Unsynchronized errata levels between specs and test suites are an
	easy and very common way to end up with non-conformant and
	non-interoperable implementations. It is absolutely necessary to
	indicate which version of the spec and which errata level the tests
	correspond to. Priority 2 is not sufficient to enforce this.
	
	Proposal
		Change priority to 1.

PF - This is already covered in GL8 CP8.2. The metadata in the test
management system should also include this information. OpsGL focuses on
the process of handling errata, while TestGL focuses on maintaining
errata info in relation to test cases. Currently the OpsGL CP8.2 is also
Priority 2...


Note TGL-JT6.
	Context
		Checkpoint 4.11.
	
	Comment
		Why is there a requirement to demonstrate results verification by
	testing three products?
	
	Proposal
		Better explain why verification by testing three products is
	important.

PF - The part concerning verification by testing three products will
be moved to ExTech or descriptive language rather than being part of
the requirement language.


Note TGL-LA15.
	Context
		Checkpoint 5.2. Ensure the ease of use for results reporting.
	Demonstrate that the results reporting has sorting and filtering
	capabilities. [Priority 1]
	
	Comment
		Results reporting is important, but does it really require sorting
	and filtering? It is more important to provide a report format that is
	easily imported into widely available tools (spreadsheets, databases,
	etc.).
	
	Proposal
		Change priority to 3.

PF - This will likely be split into two checkpoints: one for results
reporting, still Priority 1, and another for filtering and sorting on a
variety of metadata. The priority for this second checkpoint is not
defined yet.
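
As a rough, hypothetical illustration of that split (the result fields,
the CSV format, and the helper names below are invented for this email
and are not from the draft), the two capabilities might look something
like:

    import csv
    import io

    # Invented example results; none of this comes from the TestGL draft.
    RESULTS = [
        {"test_id": "t001", "spec_section": "3.2", "outcome": "pass"},
        {"test_id": "t002", "spec_section": "4.1", "outcome": "fail"},
        {"test_id": "t003", "spec_section": "3.2", "outcome": "fail"},
    ]

    def report_csv(results):
        # First checkpoint: plain results reporting, in a format that
        # widely available tools (spreadsheets, databases) can import.
        buf = io.StringIO()
        writer = csv.DictWriter(
            buf, fieldnames=["test_id", "spec_section", "outcome"])
        writer.writeheader()
        writer.writerows(results)
        return buf.getvalue()

    def failures_by_section(results):
        # Second checkpoint: filtering and sorting on the stored metadata,
        # e.g. only the failures, ordered by the spec section they test.
        failures = [r for r in results if r["outcome"] == "fail"]
        return sorted(failures, key=lambda r: r["spec_section"])

    print(report_csv(RESULTS))
    print(failures_by_section(RESULTS))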

Received on Thursday, 20 March 2003 20:36:33 UTC