
To the XML Schema WG Task Force: Leonid Arbouzov, John Tebbutt and Kongyi Zhou

Thanks for your comments on the QA Working Group's Test Guidelines 
Document. Although the Test Guidelines have not reached last call 
status yet, we are taking your comments into account for the next 
update of the Test Guidelines. We also thought we should reply to 
your comments as if they were for last call.

At our last Face to Face meeting we discussed this document and 
decided that large sections of it, including the introduction, need 
to be rewritten. Many of the guidelines and checkpoints also need 
re-ordering and rewriting.

The resolution to each comment is listed right after the comment. 
These resolutions were discussed by the WG in a telcon on March 24th.

Thanks again,

Peter Fawcett
Test Guidelines Lead Editor
-----------------------

Note TGL-LA1.
	Context
		Status of this document, 4th and 6th paragraphs: "A future
	version of this document will be accompanied by a "Specification
	Examples & Techniques" document"

	Comment
		Wrong document name: "Specification Examples & Techniques".
	This is also a direct duplication of the 4th paragraph: "This part
	of the Framework document family will eventually have an informative
	accompanying QA Framework: Test Examples and Techniques document"

	Proposal
		Drop 4th paragraph and fix 6th paragraph to use "Test Examples
	& Techniques"

Resolution:
- Yes, agree with the proposal; this is part of the Intro re-write.


Note TGL-JT1.
	Context
		General

	Comment
		The style of the language is very poor: the document has the
	appearance of having been written hurriedly and never reviewed
	(perhaps not what we would expect from the QA group!). In some cases,
	the language is so garbled as to render sections ambiguous. Recommend
	extensive editorial review and enhancement. In particular: Checkpoint
	3.1, final 2 paragraphs; Checkpoint 4.2; Checkpoint 4.3.

	Proposal
		Improve language.

Resolution:
- Yes, agree. This was discussed at the last F2F.


Note TGL-LA2.
	Context
		1. Introduction

	Comment
		The Introduction doesn't follow the recommendations of SpecGL
	(checkpoints SpecGL 1.1, 1.2, 2.1) and doesn't describe the scope of
	the specification and the classes of product.

	Proposal
		Follow the structure of the introduction used in SpecGL and
	OperGL and clearly describe the scope and class of products.

Resolution:
- Yes, agree. This was specifically discussed at the last F2F. The 
Intro needs to be re-written to be in accordance with the other 
documents.


Note TGL-JT2.
	Context
		1. Introduction

	Comment
		The terms "conformance" and "compliance" are interchanged in an
	apparently haphazard fashion. The term "compliance" occurs nowhere
	else in the Framework.

	Proposal
		Replace all occurrences of "compliance" with "conformance".

Resolution:
- Yes, agree. We do conformance testing, not compliance testing. This 
was specifically mentioned at the last F2F.


Note TGL-JT3.
	Context
		1.1 Motivation for this guidelines document. 1st bullet.

	Comment
		First item in bulleted list: it is not realistic to assume
	that it is possible to eliminate the possibility of misinterpretation
	of a specification.

	Proposal
		Substitute "minimized" for "eliminated".

Resolution:
- Yes. 'Eliminated' is a nice goal but it's not realistic.


Note TGL-LA3.
	Context
		1.1 Motivation for this guidelines document. 2nd bullet.
	"Developing an open conformance test suite for a specification,
	applicable by any implementation. "

	Comment
		It's not clear what "open" means here. Availability and
	licensing policy are in the scope of the Operation Guidelines and
	should be addressed there.

	Proposal
		Drop the word "open" here and say something about "openness" in
	Operation Guidelines.

Resolution:
- Yes, there is an attempt to not repeat material from OpsGL. This 
will either be reworded or removed in the next re-write of the 
introduction.


Note TGL-LA4.
	Context
		1.1 Motivation for this guidelines document. 3rd bullet.
	"Ensuring the quality of the particular implementation by testing
	against specifications and conducting interoperability testing with
	other available implementations."

	Comment
		There are multiple issues in this sentence. First, the quality
	of an implementation is a much wider notion than just conformance.
	Along with conformance criteria it may also include performance,
	reliability, interoperability and other criteria. Also,
	interoperability is a different notion than conformance to the
	specification. For example, two implementations can be interoperable
	but non-conformant, and two conformant implementations can be
	non-interoperable.

	Proposal
		Change to: "Improving the quality of the particular
	implementation by testing against specifications and conducting
	interoperability testing with other available implementations."

Resolution:
- Yes, it was decided at the last F2F that this needs to be 
re-written as part of the Intro re-write.


Note TGL-LA5.
	Context
		1.1 Motivation for this guidelines document. "A free-to-use
	conformance test suite that covers most if not all of the
	specification requirements, is developed by interested parties across
	industry, and is applicable to any of the specification's
	implementations, provides:"

	Comment
		Freedom of use and the parties involved in the development are
	in the scope of the Operation Guidelines. These are not really
	relevant to the subsequent logic.

	Proposal
		Change to: "A conformance test suite that covers most if not
	all of the specification requirements, and is applicable to any of
	the specification's implementations, provides:"

Resolution:
- Yes, it was decided at the last F2F that this needs to be 
re-written as part of the Intro re-write, and to not duplicate or 
overlap with OpsGL.


Note TGL-LA6.
	Context
		1.2 Navigating through this document

	Comment
		What Guideline covers test quality and especially portability?

	Proposal
		Add guidelines ensuring test quality and portability between
	implementations.

Resolution:
- We are working on introducing a new guideline that covers test 
quality to some degree. Test quality is a very difficult thing to 
quantify. This is a major area of focus for the rewrite.


Note TGL-LA7.
	Context
		Checkpoint 1.1. Identify the target set of specifications that
	are being tested. "[EX-TECH] For example, XML test suite [@@Link] may
	not include tests that specifically test the URN format, but XSLT
	[@@Link] and XQuery [@@Link] test suites will include many tests for
	XPath functions."

	Comment
		Is this URN or URI?

	Proposal
		Change to URI?

Resolution:
- Yes. That's right. Change to URI.


Note TGL-LA8.
	Context
		Checkpoint 1.5. Identify all the discretionary choices defined
	in the specification [Priority 1]

	Comment
		Is this really Priority 1? Other similar checkpoints have
	priority 2.

	Proposal
		Change priority to 2?

Resolution:
- This is now part of the test management system, as part of the 
required metadata/information that needs to be stored in the system. 
Rather than imposing a structure on the test suite, this allows the 
test suite to be filtered by different criteria, including structure.


Note TGL-JT4.
	Context
		Section 1.5.

	Comment
		Final definition, "Results Verification": presumably the
	intention is to determine whether an implementation passes or fails
	a test, not "if a test passes or fails". See also Checkpoint 4.11.

	Proposal
		Clarification is needed.

Resolution:
- Yes, the wording needs work. This will be fixed with the re-write 
of the glossary.


Note TGL-LA9.
	Context
		Checkpoint 2.2. Provide mapping between the test suite
	structure and the specification structure. [Priority 1]

	Comment
		All test coverage related items are very important and should
	have the same priority.

	Proposal
		Change priority to 2?

Resolution:
- This is now part of the test management system as part of the 
required meta data/information that needs to be stored in the system. 
Rather than imposing a structure on the test suite, allow for the 
test suite to be filtered by different criteria including structure.
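
For illustration only (TestGL does not define any particular data 
model, and every name and field below is made up for this sketch), 
the kind of metadata-driven filtering described above might look 
roughly like this:

    # Hypothetical sketch: record the spec-structure mapping as per-test
    # metadata and filter on it, instead of imposing a directory layout.
    from dataclasses import dataclass, field

    @dataclass
    class TestCaseMetadata:
        test_id: str
        spec_sections: list = field(default_factory=list)   # e.g. ["4.2.1"]
        discretionary_items: list = field(default_factory=list)

    def filter_tests(suite, spec_section=None):
        """Select tests by metadata rather than by suite structure."""
        return [t for t in suite
                if spec_section is None or spec_section in t.spec_sections]

    suite = [TestCaseMetadata("t001", ["4.2.1"]),
             TestCaseMetadata("t002", ["3.1"])]
    print([t.test_id for t in filter_tests(suite, spec_section="4.2.1")])

The same records could equally be kept in XML or a database; the point 
is only that structure becomes one more filterable criterion.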


Note TGL-LA10.
	Context
		Checkpoint 3.2. Identify publicly available testing techniques.
	List the publicly available testing techniques that have been reused
	[Priority 1]

	Comment
		Sharing testing techniques is important for test development
	performance but is probably not P1.

	Proposal
		Change priority to 2 or 3?

Resolution:
- Not a checkpoint anymore; now related to ExTech or descriptive 
language. These are good things to do as one is developing a test 
framework or test management system, but they are not really 
testable. It's more along the lines of best practices.


Note TGL-LA11.
	Context
		Checkpoint 4.1. List available test frameworks and applicable
	automation. Identify available test frameworks used. If none, justify
	why new frameworks are needed and existing ones could not be used.
	[Priority 1]

	Comment
		This is again related only to performance and 
efficiency of test
	development so can be of lower priority. Besides researching all
	frameworks available in the world could take a lot of unnecessary
	efforts. Besides adaptation efforts should be taken into account. Also
	it would be good to choose such a test framework that could be used
	with different test execution automation systems

	Proposal
		Change priority to 2 or 3? Limit frameworks to those already
	used in W3C?

Resolution:
- Not a checkpoint anymore; now related to ExTech or descriptive 
language. These are good things to do as one is developing a test 
framework or test management system, but they are not really 
testable. It's more along the lines of best practices.


Note TGL-LA12.
	Context
		Checkpoint 4.2. Ensure the framework and automation are
	platform independent. Demonstrate on 3 platforms. Ensure that the
	framework and automation are built using open standards. [Priority 1]

	Comment
		These are maybe two separate checkpoints. The second one, about
	openness, is not very clear and needs clarification. What does "open
	standard" mean? For example, is Perl an open standard? Python? Tcl?

	Proposal
		Either drop or clarify the second sentence.

Resolution:
- This is currently moving to either ExTech or descriptive language 
rather than remaining an explicit checkpoint.


Note TGL-JT5.
	Context
		Checkpoint 4.4.

	Comment
		It is not reasonable a priori to claim that a TS will
	"eventually cover all areas of the specification".

	Proposal
		Remove this sentence.

Resolution:
- True. Will be re-worded in the new CP in the Test Management 
Guideline.


Note TGL-LA13.
	Context
		Checkpoint 4.5. Ensure the ease of use for the test automation.
	Document how the test automation is used. [Priority 1]

	Comment
		Of course some minimum documentation is required. High-quality
	documentation, however, requires significant resources and may be of
	lower priority. If unqualified, it's not clear how to check "ease of
	use".

	Proposal
		Split into several checkpoints of different priorities?

Resolution:
- Ease of use is a hard concept to define or validate. The new focus 
is on ensuring that the test framework produces consistent, 
predictable results regardless of who is running the tests.


Note TGL-LA14.
	Context
		Checkpoint 4.6. Ensure the framework allows for specification
	versioning and errata levels. Explain how specification versioning and
	errata levels are accommodated by the test framework [Priority 2]

	Comment
		Unsynchronized errata levels between specs and test suites are
	an easy and very common route to non-conformant and non-interoperable
	implementations. It is absolutely necessary to indicate which version
	of the spec and which errata level the tests correspond to. Priority 2
	is not sufficient to enforce this.

	Proposal
		Change priority to 1.

Resolution:
- This is already covered in GL8 CP8.2. The metadata in the test 
management system should also include this information. OpsGL focuses 
on the process of errata while TestGL focuses on maintaining errata 
info in relation to test cases. Currently the OpsGL CP8.2 is also 
Priority 2.
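
Purely as an illustration of "maintaining errata info in relation to 
test cases" (nothing here is mandated by TestGL; the field names are 
invented for this sketch), such metadata could let a test runner keep 
the executed tests in sync with the spec snapshot under test:

    # Hypothetical sketch: per-test spec-version and errata-level metadata,
    # used to select only tests written against the spec snapshot under test.
    tests = [
        {"id": "t001", "spec_version": "1.0", "errata_level": "E01"},
        {"id": "t002", "spec_version": "1.0", "errata_level": "E02"},
    ]

    def tests_for(spec_version, errata_level, suite):
        """Keep only tests matching the given spec version and errata level."""
        return [t for t in suite
                if t["spec_version"] == spec_version
                and t["errata_level"] == errata_level]

    print([t["id"] for t in tests_for("1.0", "E02", tests)])  # -> ['t002']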


Note TGL-JT6.
	Context
		Checkpoint 4.11.

	Comment
		Why is there a requirement to demonstrate results verification
	by testing three products?

	Proposal
		Better explain why verification by testing three products is
	important.

Resolution:
- The part concerning verification by testing three products will be 
moved to ExTech or descriptive language rather than being part of the 
requirement language. The goal was to ensure that whatever results 
system is used, it is open and supported on a number of systems. But 
requiring some arbitrary number of products to verify this does not 
seem to be the best way to ensure that goal.


Note TGL-LA15.
	Context
		Checkpoint 5.2. Ensure the ease of use for results reporting.
	Demonstrate that the results reporting has sorting and filtering
	capabilities. [Priority 1]

	Comment
		Results reporting is important, but does it really require
	sorting and filtering? It is more important to provide a report
	format that is easily imported into widely available tools
	(spreadsheets, databases, etc.).

	Proposal
		Change priority to 3.

Resolution:
- This will likely be split into two checkpoints: one for results 
reporting, still priority 1, and another for filtering and sorting on 
a variety of metadata. The priority for this second checkpoint is not 
yet defined.
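
As a purely illustrative aside (no report format is prescribed by the 
checkpoint; the column names below are invented), emitting results in 
a simple tabular form gives both properties cheaply: the file imports 
directly into spreadsheets and databases, and filtering or sorting is 
then a small operation over the metadata columns:

    # Hypothetical sketch: write results as CSV for easy import, then
    # filter and sort on the metadata columns.
    import csv

    results = [
        {"test_id": "t001", "spec_section": "4.2.1", "outcome": "pass"},
        {"test_id": "t002", "spec_section": "3.1", "outcome": "fail"},
    ]

    with open("results.csv", "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=["test_id", "spec_section", "outcome"])
        writer.writeheader()
        writer.writerows(results)

    failures = sorted((r for r in results if r["outcome"] == "fail"),
                      key=lambda r: r["spec_section"])
    print([r["test_id"] for r in failures])  # -> ['t002']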

Received on Friday, 4 April 2003 16:46:31 UTC