QAF Ops Guidelines review action [LONG]

All,

In response to a request [1] from the QA working group, forwarded to
WebOnt, Jeremy and I have prepared:
I) summary and general WebOnt positions on the QA Framework (see [QAF-INTRO]),
II) a QA Operational Guidelines [QAF-OPS] Case Study document for OWL,
III) specific comments on the [QAF-OPS] document family as used for this
case study.

We solicit working group review and request discussion time for this
topic during this Thursday's telecon.  Sorry for the length.  Suggest
at least reading I.

-Evan


[1] http://lists.w3.org/Archives/Public/www-webont-wg/2003Sep/0076.html

[QAF-OPS] QA Framework: Operational Guidelines
http://www.w3.org/TR/2003/CR-qaframe-ops-20030912/

[QAF-INTRO] QA Framework: Introduction
http://www.w3.org/TR/2003/CR-qaframe-intro-20030912/

*****

I. Summary of comments on QAF

The Web Ontology (WebOnt) Working Group has just completed a review of
the CR version of the Quality Assurance Framework: Operational
Guidelines [QAF-OPS] assessing how well WebOnt activities in developing
the OWL specification conformed to those guidelines.  This process
required reading and understanding the [QAF-OPS] document and to a lesser
extent understanding other parts of the QAF.  During this process, we
have come to the opinion that these QAF documents should not reach
Recommendation status without significant change.  The goals for the
framework are laudable -- to capture and institutionalize best
practices for the fair, open, and effective development and
maintenance of standards that lead to interoperability.  But to
achieve these goals the QAF materials need to be clear, concise,
consistent and compact.  We do not find the QA Framework documents
that we reviewed to have these qualities and believe that the changes
needed to meet these goals will be large enough to force another Last
Call phase.


1 RATIONALE

1.1 Too big and too expensive

The QAF document family is quite large, including an Introduction,
Glossary, and three document subfamilies: the QA Framework Operational
Guidelines, the QA Specification Guidelines [QAF-SPEC], and the QA Test
Guidelines [QAF-TEST].  Each of these families has (or will have), in
addition to its core document, two accompanying checklists, an
Examples & Techniques document, and various templates.  We were
bewildered by the myriad of documents, found inconsistencies among CR
components of the Guidelines document family, and found the Glossary
to be incomplete. In short, we had a frustrating experience.

We are particularly concerned that [QAF-OPS] puts the burden of
understanding this entire document set onto those chartering a new
working group.  Requirements for chartering should be confined to
those items necessary for success of the project to be undertaken by
the new working group. Much more than that endangers the process at
bootstrapping time and may lead to premature decisions that haunt the
group later on.

1.2 The cost of comprehensive test materials

We note that the abstract of the Operational Guidelines scopes the work to 
"building conformance test materials". However, the QA WG charter has the 
goal of "usable and useful test suites".

WebOnt specifically decided to try to build a "usable and useful test 
suite" rather than "conformance test materials". A particular way in which 
we have found our test suite usable and useful is as a means by which to 
explore our issues and to state our issue resolutions. We believe this has 
directly contributed to the quality of our recommendations.

The QA documents, with their emphasis on thoroughness and procedures would 
have significantly added to the cost of the OWL recommendations without, in 
our view, a commensurate increase in quality.

As an example, Guideline 10 of the Specification Guidelines mandates the 
use of test assertions for each testable aspect of a specification; our 
understanding is that Guideline 7.1 prefers the use of MUST etc. in such 
assertions. To illustrate, we apply this to one line in the central OWL 
document (Semantics and Abstract Syntax), i.e. the definition of unionOf in
http://www.w3.org/TR/owl-semantics/direct.html#3.2

The clear if somewhat mathematical definition becomes the following text:

[[
If x is in the interpretation of unionOf(c1 ... cn) then there MUST be some 
i such that x is in the interpretation of ci. If x is in the interpretation 
of ci then x MUST be in the interpretation of unionOf(c1 ... cn).
]]

(Note the two MUSTs are separately testable).
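To make the separate testability concrete, the two assertions can be checked mechanically over a finite interpretation. The following is our own illustrative sketch (the set-based modelling is a simplification of the OWL direct semantics, and the function name is hypothetical, not part of any OWL test material):

```python
# Sketch: checking the two separately testable unionOf assertions over a
# finite interpretation, with class extensions modelled as plain sets.
# This is an illustration only, not part of the OWL test suite.

def check_union_assertions(union_ext, class_exts):
    """union_ext: the interpretation of unionOf(c1 ... cn), as a set.
    class_exts: the interpretations of c1 ... cn, as a list of sets.
    Returns (assertion1_holds, assertion2_holds)."""
    # Assertion 1: if x is in unionOf(c1 ... cn), x MUST be in some ci.
    a1 = all(any(x in ci for ci in class_exts) for x in union_ext)
    # Assertion 2: if x is in some ci, x MUST be in unionOf(c1 ... cn).
    a2 = all(x in union_ext for ci in class_exts for x in ci)
    return a1, a2

# A correct interpretation satisfies both assertions; an interpretation
# missing an element of some ci fails the second independently of the first.
print(check_union_assertions({1, 2, 3}, [{1, 2}, {3}]))  # (True, True)
print(check_union_assertions({1, 2}, [{1, 2}, {3}]))     # (True, False)
```

The independence of the two results is precisely what makes each MUST separately testable.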

Doing that a hundred times over would have made the document unreadable, 
for the relatively minor advantage of being able to quantify the coverage 
of the specification by the test suite, and to better link each test to the 
aspect of the specification that it was trying to explore. It also seems an 
abuse of RFC 2119 language to use MUST to constrain mathematical or textual 
objects, rather than agents.

Moreover, we could have a test for each of these MUSTs while still failing 
to provide the well-known challenges that come from combining the features 
of OWL in awkward ways. Thus, for an adequate conformance test suite it 
does not suffice to document each testable assertion and have a test for 
each; every combination would also need a test (an impossible task). 
With our more modest goals of usefulness, experts within our group have 
selected tests from the literature that provide certain known challenging 
feature combinations.


1.3 Constraining other WGs 

The Web Ontology WG believes that Rec track documents should define 
technology and define conformance clauses for software, hardware, and also 
specifications, but should not mandate that W3C working groups or specs 
must be conformant.


1.4 WebOnt did well without the CR QA Framework

As noted in our detailed assessment of OWL QA procedures in comparison
to QAF Guidelines, WebOnt accomplished many of the stated goals of the
QAF without conforming to [QAF-OPS], producing a number of QA
materials (such as Issues and Tests) which helped to discipline and
guide the language development process. As part of this, WebOnt
documented the language in multiple ways which addressed the needs of
the diverse audience for the OWL specification.  Extensive tests were
defined and test results are now available [OWL-TEST-RESULTS] for ten
different OWL tools.  We believe these successes were due in no small
part to the ability of the working group to choose the approaches and
deliverables that matched the needs of the language specified and the
skills, availability, and interests of WebOnt members.  This
flexibility would not have been available had the group been forced to
conform to [QAF-OPS] as currently written.


2 RECOMMENDED CHANGES IN QAF:

We believe that it would be detrimental to future W3C work projects
for all or significant portions of the QA Framework to be incorporated
into the W3C Process document.  Rather we would like to see the QAF
transformed into a flexible, user friendly set of tools, templates,
and guidelines which the Process document can reference rather than
mandate.  The following are some suggestions for moving these
documents towards this goal.  Detailed issues with the QA Framework:
Operational Guidelines family can be found in III, although some are
highlighted in the bullets below.

* Clearly and consistently emphasize QA Framework Introduction as
a starting point into the QAF, possibly removing the QAF-OPS altogether
and using the QA Framework Primer section to serve the same role.

* The use of MUST in the conformance clauses is overly strong.  We
suggest that minimally these should be weakened to SHOULD, and in some
cases should be lowered to MAY to be more generally applicable.  The
QA documents intend to be applicable to all W3C WGs and
specifications; inevitably there will be unforeseen circumstances for
which a considered decision to not implement some part of the QA
framework is appropriate.  For example, in response to a patent that
appears to impact some part of a W3C Recommendation it may be
necessary to reissue a new version of that Recommendation avoiding the
patent, and it may be necessary to do this very quickly. In such a
case, QA goals could be an obstacle to timeliness.  

* Put all QA specific terms such as Commitment Levels, Priority
Levels, etc into the QAF Glossary. 

* Consider moving boilerplate sections to the Intro.  In any case, move
term definitions to a place prior to their use in base documents.

* Use consistent document abbreviations throughout the framework.

* Add conformance requirements to checklists.

* Confirm consistency of overlapping information in document families,
in particular Priority of checkpoints between QAF-OPS and OPS-EXTECH
(note problems with this in Guideline 6).

* Make the single HTML file version of QAF-OPS normative (if QAF-OPS
stays an independent document).  

* Either change the Operational Examples and Techniques to contain
reusable examples or change its name to reflect its true content.
Suggest "Operational Case Studies and Techniques."



[OWL-TEST-RESULTS] http://www.w3.org/2003/08/owl-systems/test-results-out


*****

II. QAF-OPS Case Study for OWL

The following is a case study documenting the quality assurance
activities undertaken by the Web Ontology working group during
development of the OWL language. It is structured as a conformance
evaluation of WebOnt QA activities as prescribed by the CR version of
the QA Framework: Operational Guidelines [QAF-OPS].  We have included
within this text the Checkpoints, Conformance Requirements and
occasional Guidelines from QAF-OPS as context for readers from WebOnt.
Text documenting WebOnt actions is prepended with "WebOnt:".

===

Checkpoint 1.1. Define QA commitment levels for operations,
specifications, and test materials. [Priority 1]

Conformance requirements:
the WG MUST define its commitment level to QA Framework: Operational
Guidelines (this specification) -- A, AA, or AAA;

for any Recommendations that it plans to produce, the WG MUST define
its commitment level to QA Framework: Specification Guidelines -- A,
AA, or AAA;

for any Test Materials that it plans to produce or adopt, the WG MUST
define its commitment level to QA Framework: Test Guidelines -- A, AA,
or AAA.

A new or rechartering Working Group MUST define its QA commitment
level in its Charter; 

an existing Working Group MUST document its QA commitment level in
some consensus record.

WebOnt: 
No commitment has been made to QAF by WebOnt.  The WG has looked at
the QAF-Specification Guidelines with mixed opinion.  The QA Test
Guidelines have yet to be reviewed by WebOnt.  This text represents
the WG's initial response to the QA Ops Guidelines and any WG decision
about commitment would occur subsequent to completion and working
group consideration of this response.


Checkpoint 1.2. Commit to test materials. [Priority 2]

Conformance requirements:

the WG MUST commit to produce or adopt at least some test materials
for each of the WG's specifications before it becomes Recommendation;

a new or rechartering Working Group MUST define its test materials
commitment in its Charter;

an existing Working Group MUST document its test materials commitment
in some consensus record.

WebOnt:

WebOnt did not make any commitment to test materials in its charter.
Resources were committed to the [TEST] document when the TEST taskforce
was formed
(see consensus record of Feb 28 2002 telecon
http://lists.w3.org/Archives/Public/www-webont-wg/2002Mar/0029.html)

The test case document became a Working Draft component of the OWL
specification when published on 24 October 2002.  Subsequent versions
of the Test document were part of LC and CR specifications and key in
the exit criteria for CR.


Checkpoint 1.3. Commit to complete test materials. [Priority 3]

Conformance requirements:

the WG MUST commit to produce or adopt a complete test materials
before Recommendation, where complete is defined as: at least one test
case for every identifiable conformance requirement of the
specification;

a new or rechartering Working Group MUST define its test materials
commitment in its Charter;

an existing Working Group MUST document its test materials commitment
in some consensus record.

WebOnt: 

Conformance classes were defined and reflected in the OWL Test
document.  However, the tests were not complete in regard to testing
all the features of the language or all its envisioned usages.  It was
felt that it was impossible to know what the complete list of usage
classes would be for the language and that a complete conformance test
suite would be inappropriate for a technology where considerable
development was still expected.  Instead a commitment was made to
produce an open set of tests which continued to expand and evolve
until PR in order to: 1) Illustrate usages of language features, 2)
illustrate characterizing features of known implementation classes, 3)
test the current boundaries of those implementation classes, 4)
capture and illustrate resolutions of issues with the design of the
language, and 5) clarify user confusion discovered by tool
implementers and other researchers.

This position was discussed both in the working meetings of the
group and in its email list and was documented in published drafts
of the Test document as early as October 2002.


Checkpoint 1.4 Enumerate QA deliverables and expected milestones.
[Priority 1]

Conformance requirements:
a new or rechartering Working Group MUST document its QA deliverables
and milestones in its Charter;

an existing Working Group MUST document its QA deliverables and
milestones in some consensus record.

WebOnt: 

The charter for this working group makes no explicit mention of QA
deliverables or milestones.  However, three OWL documents: "OWL Web
Ontology Language Use Cases and Requirements", "OWL Web Ontology
Language Guide", and "OWL Web Ontology Language Test Cases" were
committed to early in the WG's lifecycle.  These documents provide the
QA deliverables: use cases, primer, and a collection of test
assertions respectively.


Checkpoint 1.5. Define QA criteria for Recommendation-track
advancement. [Priority 2]

a new or rechartering Working Group MUST, in its charter, specify the
QA criteria required for transition of the WG's specifications between
major Recommendation-track maturity levels;

an existing Working Group MUST, in some consensus record, specify the
QA criteria required for transition of the WG's specifications between
major Recommendation-track maturity levels.

WebOnt: 
As already stated for checkpoint 1.4, no QA deliverables or milestones
were a part of the charter.  Recommendation-track phase change
criteria were keyed to implementation experience described in terms of
the test cases, as well as to resolution of public comment on the QA
materials already noted above.  This required completion of all these
materials prior to entering PR.


Guideline 2. Commit to resource level for Working Group QA activities.

Checkpoint 2.1. Address where and how conformance test materials will
be produced.  [Priority 1]

a new or rechartering Working Group MUST, in its charter, address who
will produce its test materials and how;

an existing Working Group MUST, in some consensus record, document who
will produce its test materials and how.

WebOnt:  
What this really means is whether Test Materials (TM) will be produced
within the WG or by some other party (organization or individual). As
already stated, WebOnt committed to producing its own test cases.


Checkpoint 2.2. Address QA staffing commitments. [Priority 1]

Conformance Requirements:

a new or rechartering Working Group MUST, in its charter, commit to a
staffing resource level for the tasks necessary to meet its total QA
commitments according to its where-and-how plan;

an existing Working Group MUST, in some consensus record, commit to a
staffing resource level for the tasks necessary to meet its total QA
commitments according to its where-and-how plan.

WebOnt:
Resources were committed to the Test document when the TEST taskforce
was formed
(see consensus record of Feb 28 2002 telecon
http://lists.w3.org/Archives/Public/www-webont-wg/2002Mar/0029.html)


Checkpoint 2.3. Request allocation of QA resources to the Working
Group.  [Priority 1]

a new or rechartering Working Group MUST, in its Call for
Participation, request that participating members include QA
specialists in their staff-resource allocation to the WG;

an existing Working Group MAY make an external appeal for QA-specific
resources in one of various other ways.

WebOnt: 
WebOnt did not request QA-specific resources in its call for
participation.  Even so, the group had sufficient resources to prepare
the Test, Guide, and Use Case documents already mentioned.


Guideline 3. Synchronize QA activities with the specification
milestones.

Checkpoint 3.1. Synchronize the publication of QA deliverables and the
specification's drafts. [Priority 2]

the Working Group MUST publish QA deliverables, including at least the
test materials to which the WG has committed, concurrently with each
Working Group specification publication milestone.

WebOnt: 
The test cases were not ready for the OWL Last Call, and were published 
shortly after. This means that one or two tests in the OWL Test Last Call 
reflect last call issue resolutions, rather than the text of the other OWL 
last call documents. However, this did not appear to present any difficulties.
It may have been easier if we had been clearer in our planning and had 
decided earlier that we would do that. The test document was central in the 
Candidate Rec phase, although in practice implementors were encouraged to 
work with the public editors' draft of the test document (particularly the 
list of proposed, approved and obsoleted tests in the test manifest). This 
allowed rapid feedback on WG decisions and new tests. As we approach 
Proposed Recommendation, it is arguable that we would have done better if 
we had planned for a staggered release, with the test document coming last. 
As it is, key conformance clauses are in the test document, which prevented 
such staggering.


Checkpoint 3.2. Support specification versioning/errata in QA
deliverables. [Priority 1]

Conformance Requirements:

the Working Group's test materials MUST support the unambiguous
association of test materials versions to specification versions and
errata levels;

specification versions and errata support of the Working Group's test
materials MUST be documented in test materials documentation;

the Working Group SHOULD include specification versioning/errata
considerations in any other QA deliverables such as intermediate
planning documents.

WebOnt: 
This checkpoint is applicable only to post-Recommendation
specifications, which do not yet exist for OWL.  That scope is not
clear from the checkpoint or its conformance requirements.


Guideline 4. Define the QA process.

Checkpoint 4.1. Appoint a QA moderator. [Priority 1]

Conformance Requirements:

the Working Group MUST identify a person to manage the WG's quality
practices.

WebOnt:
To the extent that a QA moderator would be strictly concerned with TM
development, the editor and co-editor of the Test document have
fulfilled that role in WebOnt.  


Checkpoint 4.2. Appoint a QA task force. [Priority 2]

Conformance Requirements:

the Working Group MUST identify and assign a QA task force for the
tasks necessary to meet the QA commitment level and the committed QA
deliverables, as identified in checkpoints 1.1 - 1.5.

WebOnt: 
A Test taskforce was formed but most test discussions were conducted
in the working group as a whole and test material contributors were
not confined to those officially belonging to the Test taskforce.  All
taskforces within WebOnt were treated more like roles taken on by the
whole group for addressing specific aspects of specification
development, rather than subsets of the group primarily focused on
those aspects.


Checkpoint 4.3 Produce the QA Process Document [Priority 1]

Conformance requirements:

the Working Group MUST produce a QA Process Document; 

the Working Group's QA Process Document MUST be publicly readable; 

the Working Group's QA Process Document MUST address at least all of
the topics required of it by other checkpoints in these operational
guidelines.

WebOnt: 
WebOnt did not produce a QA process document, per se.  The test
document outlined the character, structure, and form of the test
materials, as well as explaining who created, approved, and modified
tests, and how.  There is no evidence that the absence of a
dedicated QA process document hampered the development of OWL test
material or other quality assurance related materials.


Checkpoint 4.4. Specify means for QA-related communication. [Priority 2]

Conformance requirements:

the Working Group's QA Process Document MUST specify at least one
public archived mailing list for QA announcements and submission of
public QA comments; 

the Working Group's QA Process Document MUST establish a publicly
readable "Test" Web page.

WebOnt: 
The Test document specifies public-webont-comments@w3.org as the
appropriate place for submission of implementation experience.
Significant discussion on test cases has taken place there,
particularly as OWL nears PR.  Most of the test materials are packaged
within the Test document.  However, a web repository is also provided
for accessing test cases.


Checkpoint 4.5. Define branding policy details. [Priority 3]

Conformance Requirements:
the WG MUST document in its QA Process Document the branding policy
details, if branding is to be supported.

WebOnt: 
WebOnt does not currently sanction any OWL conformance claims,
therefore this checkpoint is not applicable.


Guideline 5. Plan test materials development. [Priority 2]

Conformance requirements:

the Working Group's QA Process Document MUST define a framework for
test materials development, that at least describes how to develop,
document and use the tests.

WebOnt: 
(based on a literal reading of the above requirement) As previously
mentioned, Appendix A of the Test document describes how tests can be
created and submitted.


Checkpoint 5.2. Ensure test materials are documented and usable for
their intended purposes. [Priority 1]

Conformance requirements:

the Working Group MUST have user documentation for its test materials
that instructs the use of the test materials for the full range of
their intended purposes.

WebOnt: 
OWL tests are embedded in a document which explains the test
type, class and purpose. Manifest files provide these same descriptions in
the form of machine readable metadata.  Implementation testing using
these test cases was conducted by many who were not party to the
development of the test materials or any other working group
activities, yet were able to run the tests and provide machine
readable result files.


Checkpoint 5.3. Define a contribution process. [Priority 2]
Conformance requirements:

the Working Group MUST describe in its QA Process Document where, how,
by whom, and to whom test materials submissions are to be made.

WebOnt: This is all outlined in the Creation portion of Appendix A of
the Test document.


Checkpoint 5.4. Address license terms for submitted test
materials. [Priority 1]

Conformance requirements:

in its QA Process Document the Working Group MUST define a submission
license policy applicable to test materials submitted to the WG by
external parties;

the Working Group's submission license policy MUST include at least an
outline of terms, conditions, constraints, and principles that will
govern acceptable submissions to the WG.

WebOnt: 
Test contributions were restricted to those within the working group.
The only licensing terms were those normally associated with W3C
working drafts.  The licensing boilerplate suggested in the test
document's Stylistic Preferences appendix implied W3C copyright for
all tests.

One potential test contribution appeared to be discouraged by this (the 
handing of copyright to W3C). Offlist, the editor encouraged the submitter 
to suggest different terms, but that appears to have presented a hurdle, 
and no contribution or further discussion of IPR terms ensued.


Checkpoint 5.5. Define review procedures for submitted test
materials. [Priority 2]

Conformance requirements:

in its QA Process Document, the Working Group MUST define a procedure
for reviewing test materials contributions; 

the Working Group's procedure for reviewing test materials
contributions MUST at least address criteria for accuracy, scope, and
clarity of the tests.

WebOnt: 
Test: Appendix A describes the process for WG Approval of tests.
Implied by this is review of the test, but no explicit process or
criteria are specified for use in such a review.  During CR the WG
tended to accept test passes by two independent systems as sufficient
evidence for approving a test (in the absence of opposition).


Guideline 6. Plan test materials publication.

WebOnt discussion: The handling of web publication of, and access to,
test materials for WebOnt is very similar to the approach taken by the
SVG project.  A W3C hosted CVS repository is used to manage the material.
Access is via website or test document.  A tabular view of the
submitted implementation test results is also available.


Checkpoint 6.1. Ensure a suitable repository location for test
materials. [Priority 1]

Conformance requirements:

the Working Group MUST keep its test materials in repository locations
that are secure, reliable, and freely accessible.

WebOnt: 
WebOnt test materials are kept in a CVS repository hosted by the W3C.


Checkpoint 6.2. Define the licenses applicable to published test
materials. [Priority 1]

Conformance requirements:

the Working Group MUST, in its QA Process Document, define the
licenses that are applicable to published test materials.

WebOnt: 
WebOnt tests are part of the Test document which clearly indicates in
its title block that W3C software licensing rules apply.


Checkpoint 6.3. Describe how and where the test materials will be published. 
[Priority 2]

Conformance requirements:

in its QA Process Document the Working Group MUST document the planned
Web location and method for publication of its test materials.

WebOnt: The OWL Test document states where and how the tests are
published on the web.


Checkpoint 6.4. Provide a conformance verification disclaimer with the
test materials. [Priority 1]

WebOnt: 
The OWL Test document includes no conformance verification disclaimer.
This could be because it already asserts that the tests it contains do
not constitute an OWL conformance test suite.


Checkpoint 6.5. Promote testing and the publication of test results. 
[Priority 2]

Conformance requirements:

in its QA Process Document the Working Group MUST document a plan to
engage implementors to participate in conformance testing activities;

in its QA Process Document the Working Group MUST document a plan to
encourage the publication of test results, including sample scenarios
of where and how such publication can be done;

in its QA Process Document the Working Group MAY identify a
WG-sponsored Web site for publishing collected results or a directory
of results.

WebOnt: 
WebOnt has no QA Process Document, per se.  The OWL Test document,
which contains material similar to that anticipated for a Process
Document, does not contain plans outlining how to involve implementors
in conformance testing activities for OWL, nor plans for collecting
and publishing the results.  Even so, implementors did a great deal
of testing, submitting the results to the group.  WebOnt makes these
results available through an OWL Test Results page that displays them
by test category, by test, and by implementation.


Guideline 7. Plan the transfer of test materials to W3C if needed.

WebOnt: WebOnt developed all its test materials within the group,
rendering most of the checkpoints for this Guideline not applicable.


Checkpoint 7.1. Perform a quality assessment of any test materials
that are candidates for transfer.  [Priority 2]

Conformance requirements:

as a part of any test materials transfer process, the Working Group
MUST perform and record an assessment of the quality of the test
materials.

WebOnt: No transfer needed. Not applicable.

Checkpoint 7.2. Identify sufficient staff resources to meet the needs
of any transferred test materials. [Priority 1]

Conformance requirements:

as a part of any test materials transfer process, in some consensus
document the Working Group MUST identify and assign staff resources
for the tasks associated with ongoing test materials development and
maintenance after the transfer.

WebOnt: 
The WG members submitting the tests did the bulk of the work required
to transform the tests from their original form to that appropriate
for the OWL Test suite. Further work was done by the editors.


Checkpoint 7.3. For any transferred test materials, resolve all IPR
issues with the external party that produced the test materials.

Conformance requirements:

as a part of any test materials transfer process, the Working Group
MUST have a documented agreement with the external entity that covers
the IPR aspects that are applicable to the transferred materials.

WebOnt: Not applicable.


Guideline 8. Plan for test materials maintenance.


WebOnt discussion: The Test cases have been made a part of the OWL
specification and are expected to be maintained along with the
specification.  The working group is expected to close subsequent to
OWL Recommendation and an errata period.  No plans have been made for
supporting Test material maintenance outside of the working group.


Checkpoint 8.1. Provide for the long-term maintenance of the
contribution and review procedures.  [Priority 3]

Conformance requirements:

in some consensus document, the Working Group MUST define a plan and
identify resources for the maintenance of the test materials'
contribution and review procedures throughout the entire life cycle of
the test materials and the Recommendation itself.

WebOnt: No plans have been made for the maintenance of OWL test
materials when WebOnt disbands.


Checkpoint 8.2. Specify a test materials update procedure to track new
specification versions/errata.  [Priority 1]

Conformance requirements:

in its QA Process Document the Working Group MUST specify procedures
to update the test materials to track new specification versions and
errata levels.

WebOnt: WebOnt has not documented any unique procedures for updating
test materials relative to post-Recommendation versions and errata
levels. However, since the test materials are a part of the
specification set for OWL, any published release of OWL should have
consistent test materials by virtue of normal publication procedures.


Checkpoint 8.3. Identify a procedure for test validity appeals.
[Priority 2]

Conformance requirements:

in its QA Process Document the Working Group MUST identify a
communication channel for appeals of test validity and a procedure for
resolving such appeals.

WebOnt: 
The Test document included a process for test vetting and subsequent
decommissioning if appropriate.  During the development of OWL, the
Issue list provided a formal mechanism leading to creation or removal
of tests.  During Recommendation-process phase transitions, public
comment procedures were used, and new information sometimes led to new
or revisited issues and thus new or revised tests.  These processes
were all designed to work in the context of an active WebOnt WG, and
will not apply after its dissolution.


[QAF-OPS] QA Framework: Operational Guidelines
http://www.w3.org/TR/2003/CR-qaframe-ops-20030912/

[TEST] OWL Web Ontology, Test Cases
http://www.w3.org/TR/2003/PR-owl-test-20031215/


*****


III. Detailed comments on QA Operational Guidelines document - [QAF-OPS]

New Working Groups - within Guideline 1 the following is said about
new working groups in the context of Operational Guidelines: "Working
Groups that are renewing their charters are considered the same as new
WGs."  Perhaps this is as distinguished from "extending" their charters
(i.e. seeking an extension to finish work on work items already
well underway).  Any requirements implied by these guidelines should
only apply to new work items begun after the QAF becomes a
Recommendation.  When WGs are extending their charters they are already
straining the availability of participating members and endangering
the schedules of dependent projects.  Adding new requirements at such
a time needlessly endangers the goals of the WG and of dependent groups
and projects.

The checkpoints in the QA OPS document are actually compound
checkpoints (see conformance requirements for each checkpoint in
QAF-OPS).  The OPS-CHECKLIST and OPS-ICS tables elide this and thus
hide the complexity and resulting cost of meeting the QA requirements.
Furthermore, for those WGs who do review their QA conformance with
this checklist it will be necessary to review each requirement and
useful to capture a record of how the WG addressed the requirement.
In other words, the tables would be more useful if the conformance
requirements were included.

Commitment Levels - the following levels are enumerated but not
explained where they are used, nor in the QA Introduction or QA
Glossary: A, AA, and AAA.  This material should precede its use.  It
currently appears in section 4 of QAF-OPS, and no forward reference is
provided where the levels are used.

Document structure -  The components of the QA Ops document are not 
sufficiently large or independent of each other to justify the
compound structure of this document.  I found it quite frustrating to
navigate this version while relating the checkpoints to our WG
actions.  Recommend making the single HTML file the normative version
of QAF-OPS.

What constitutes a QA deliverable or milestone?: Checkpoint 1.4 asks
about enumeration of QA deliverables without providing a comprehensive
definition for such things.  The discussion section of the QA OPS doc
provides a partial(?) list but ironically the Examples & Techniques
doc provides no such list (although a few examples not in the OPS doc
list are scattered among the text of the Examples doc for this
checkpoint).

Bootstrapping - These guidelines require that considerable planning and
assignment of resources take place prior to chartering a WG.  There
are several dangers with such an approach: 1) the weight of work and
high commitment requirements prior to chartering could doom the
chartering process to failure, 2) planning work and assigning
resources prior to WG formation could result in poor choices, since the
membership of the group has not yet been determined, much less arrived
at any common thinking, and 3) those members of the WG who had not
been involved in its chartering would feel no ownership of or
commitment to plans made prior to their involvement.

QA materials - Checkpoint 2.1 and its conformance requirements concern
Test Materials, but the Rationale talks of QA deliverables and
commitment which is a wider concern.  Which is it?

Where-and-how plan - the term "where-and-how plan" is used without
being explicitly defined.

QA moderator -  This seems to be a synonym for a test materials (TM)
development lead.  If that is what is meant, then why obfuscate by
using the broader term?

Checkpoint 5.1 Define a framework for test material development - The
Conformance Requirement for this checkpoint talks of a framework while
the Rationale talks of a plan.  I am not sure how detailed a plan is
wanted here, and how this requirement differs from previous
checkpoints that talked of creating a scenario for how test materials
were to be developed.  This part of the Checkpoint description needs to
be made consistent and differentiated from the requirements in
checkpoints 2.1 and 2.2.

QAF-OPS and OPS-EXTECH docs out of synch - The CR versions of the
QA Framework: Operational Guidelines
<http://www.w3.org/TR/2003/CR-qaframe-ops-20030922/> and the
QA Framework: Operational Examples & Techniques
<http://www.w3.org/QA/WG/2003/09/qaframe-ops-extech-20030912/>
documents do not agree on priorities for the 6.x checkpoints.  The
QAF-OPS doc sets higher priorities than the OPS-EXTECH doc for all
checkpoints of Guideline 6 save 6.3.

Checkpoint fragmentation - Checkpoint 6.2 is essentially a detail
concerning checkpoint 5.4.  Clearer and more succinct QA documentation
would merge these.

Received on Tuesday, 16 December 2003 18:16:44 UTC