- From: Jim Hendler <hendler@cs.umd.edu>
- Date: Fri, 2 Jan 2004 11:08:22 -0500
- To: www-qa@w3.org
- Cc: Jeremy Carroll <jjc@hplb.hpl.hp.com>, Guus Schreiber <schreiber@cs.vu.nl>, Dan Connolly <connolly@w3.org>, Sandro Hawke <sandro@w3.org>
The following are Web Ontology Working Group comments resulting from performing the OWL - QA OpsGL case study sent in a separate email. These comments represent an official working group position on the QAF, as decided on 18 Dec 2003 [1]. The comments consist of:

  I.  Summary of comments on the QAF
  II. Detailed comments on the QA Operational Guidelines document

[1] http://lists.w3.org/Archives/Public/www-webont-wg/2003Dec/0095.html

*****

I. Summary of comments on the QAF

The Web Ontology (WebOnt) Working Group has just completed a review of the CR version of the Quality Assurance Framework: Operational Guidelines [QAF-OPS], assessing how well WebOnt's activities in developing the OWL specification conformed to those guidelines. This process required reading and understanding the [QAF-OPS] document and, to a lesser extent, other parts of the QAF. During this process we have come to the opinion that these QAF documents should not reach Recommendation status without significant change.

The goals of the framework are laudable: to capture and institutionalize best practices for the fair, open, and effective development and maintenance of standards that lead to interoperability. But to achieve these goals the QAF materials need to be clear, concise, consistent, and compact. We do not find the QA Framework documents that we reviewed to have these qualities, and we believe that the changes needed to meet these goals will be large enough to force another Last Call phase.

1 RATIONALE

1.1 Too big and too expensive

The QAF document family is quite large, including an Introduction, a Glossary, and three document subfamilies: the QA Framework Operational Guidelines [QAF-OPS], the QA Specification Guidelines [QAF-SPEC], and the QA Test Guidelines [QAF-TEST]. Each of these subfamilies has (or will have), in addition to its core document, two accompanying checklists, an Examples & Techniques document, and various templates. We were bewildered by the myriad of documents, found inconsistencies among the CR components of the Guidelines document family, and found the Glossary to be incomplete. In short, we had a frustrating experience.

We are particularly concerned that [QAF-OPS] puts the burden of understanding this entire document set onto those chartering a new working group. Requirements for chartering should be confined to those items necessary for the success of the project to be undertaken by the new working group. Much more than that endangers the process at bootstrapping time and may lead to premature decisions which may haunt the group later on.

1.2 The cost of comprehensive test materials

We note that the abstract of the Operational Guidelines scopes the work to "building conformance test materials". However, the QA WG charter has the goal of "usable and useful test suites". WebOnt specifically decided to try to build a "usable and useful test suite" rather than "conformance test materials". A particular way in which we have found our test suite usable and useful is as a means by which to explore our issues and to state our issue resolutions. We believe this has directly contributed to the quality of our recommendations. The QA documents, with their emphasis on thoroughness and procedures, would have significantly added to the cost of the OWL recommendations without, in our view, a commensurate increase in quality.

As an example, Guideline 10 of the Specification Guidelines mandates the use of test assertions for each testable aspect of a specification, and our understanding is that Guideline 7.1 prefers the use of MUST etc. in such assertions. We apply this to one line in the central OWL document (Semantics and Abstract Syntax), i.e. the definition of unionOf in http://www.w3.org/TR/owl-semantics/direct.html#3.2. The clear, if somewhat mathematical, definition becomes the following text:

[[
If x is in the interpretation of unionOf(c1 ... cn) then there MUST be some i such that x is in the interpretation of ci. If x is in the interpretation of ci then x MUST be in the interpretation of unionOf(c1 ... cn).
]]

(Note that the two MUSTs are separately testable.)
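For comparison, the definition being paraphrased above reads approximately as follows (a rough LaTeX rendering rather than a quotation from the document, using EC for the class-extension mapping of the direct semantics):

\[
EC(\text{unionOf}(c_1 \ldots c_n)) = EC(c_1) \cup \cdots \cup EC(c_n)
\]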
Doing that a hundred times over would have made the document unreadable, for the relatively minor advantage of being able to quantify the coverage of the specification by the test suite and to better link each test to the aspect of the specification it was trying to explore. It also seems an abuse of RFC 2119 language to use MUST to constrain mathematical or textual objects rather than agents.

Moreover, we could have a test for each of these MUSTs while still failing to provide the well-known challenges that come from combining the features of OWL in awkward ways. Thus, for an adequate conformance test suite, it does not suffice to document each testable assertion and to have a test for each; every combination would also have to be tested (an impossible task). With our more modest goal of usefulness, experts within our group selected tests from the literature that provide certain known challenging feature combinations.

1.3 Constraining other WGs

The Web Ontology WG believes that Rec-track documents should define technology and define conformance clauses for software, hardware, and also specifications, but should not mandate that W3C working groups or specifications be conformant. Thus it would be inappropriate for the QA WG to mandate conformance with the procedures and guidelines in the QAF documents, yet this possibility is admitted in the Normative Guidelines section of the QA Framework: Introduction.

1.4 WebOnt did well without the CR QA Framework

As noted in our detailed assessment of OWL QA procedures in comparison to the QAF guidelines, WebOnt accomplished many of the stated goals of the QAF without conforming to [QAF-OPS]. WebOnt produced a number of QA materials (such as the Issues and Test documents) which helped to discipline and guide the language development process. As part of this, WebOnt documented the language in multiple ways, addressing the needs of the diverse audience for the OWL specification. Extensive tests were defined, and test results are now available [OWL-TEST-RESULTS] for more than a dozen different OWL tools. We believe these successes were due in no small part to the ability of the working group to choose the approaches and deliverables that matched the needs of the language being specified and the skills, availability, and interests of WebOnt members. This flexibility would not have been available had the group been forced to conform to [QAF-OPS] as currently written.

2 RECOMMENDED CHANGES IN QAF

We believe that it would be detrimental to future W3C work for all or significant portions of the QA Framework to be incorporated into the W3C Process document. Rather, we would like to see the QAF transformed into a flexible, user-friendly set of tools, templates, and guidelines which the Process document can reference rather than mandate.
The following are some suggestions for moving these documents towards this goal. Detailed issues with the QA Framework: Operational Guidelines family can be found in part II, although some are also highlighted in the bullets below.

* Clearly and consistently emphasize the QA Framework: Introduction as the starting point into the QAF, possibly removing QAF-OPS altogether and using the QA Framework Primer section to serve the same role.

* The use of MUST in the conformance clauses is overly strong. We suggest that, at a minimum, these be weakened to SHOULD, and in some cases lowered to MAY, to make them more generally applicable. The QA documents are intended to apply to all W3C WGs and specifications; inevitably there will be unforeseen circumstances for which a considered decision not to implement some part of the QA Framework is appropriate. For example, in response to a patent that impacts some part of a W3C Recommendation, it may be necessary to reissue a new version of that Recommendation avoiding the patent, and it may be necessary to do this very quickly. In such a case, QA goals could be an obstacle to timeliness.

* Put all QA-specific terms, such as Commitment Levels, Priority Levels, etc., into the QAF Glossary.

* Consider moving boilerplate sections to the Introduction. In any case, move term definitions to a place prior to their use in the base documents.

* Use consistent document abbreviations throughout the framework.

* Add conformance requirements to the checklists.

* Confirm the consistency of overlapping information across the document families, in particular the priorities of checkpoints between QAF-OPS and OPS-EXTECH (note the problems with this in Guideline 6).

* Make the single-HTML-file version of QAF-OPS normative (if QAF-OPS stays an independent document).

* Either change the Operational Examples & Techniques document to contain reusable examples or change its name to reflect its true content. We suggest "Operational Case Studies and Techniques."

[OWL-TEST-RESULTS] http://www.w3.org/2003/08/owl-systems/test-results-out

*****

II. Detailed comments on the QA Operational Guidelines document - [QAF-OPS]

New working groups - Within Guideline 1 the following is said about new working groups in the context of the Operational Guidelines: "Working Groups that are renewing their charters are considered the same as new WGs." Perhaps this is as distinguished from "extending" their charters (i.e. looking for an extension to finish work on work items already well underway). Any requirements implied by these guidelines should only apply to new work items begun after the QAF becomes a Recommendation. When WGs are extending their charters they are already straining the availability of participating members and endangering the schedules of dependent projects. Adding new requirements at such a time needlessly endangers the goals of the WG and of dependent groups and projects.

Compound checkpoints - The checkpoints in the QA OPS document are actually compound checkpoints (see the conformance requirements for each checkpoint in QAF-OPS). The OPS-CHECKLIST and OPS-ICS tables elide this and thus hide the complexity and resulting cost of meeting the QA requirements. Furthermore, for those WGs that do review their QA conformance with this checklist, it will be necessary to review each requirement, and it would be useful to capture a record of how the WG addressed it. In other words, the tables would be more useful if the conformance requirements were included.
Commitment Levels - The levels A, AA, and AAA are enumerated but are not explained where they are used, nor in the QA Introduction or QA Glossary. This material should precede its use. It currently appears in section 4 of QAF-OPS, and no forward reference is provided where the levels are used.

Document structure - The components of the QA Ops document are not sufficiently large or independent of each other to justify the compound structure of this document. We found it quite frustrating to navigate this version while relating the checkpoints to our WG actions. We recommend making the single HTML file the normative version of QAF-OPS.

What constitutes a QA deliverable or milestone? - Checkpoint 1.4 asks about the enumeration of QA deliverables without providing a comprehensive definition of such things. The discussion section of the QA OPS document provides a partial(?) list, but ironically the Examples & Techniques document provides no such list (although a few examples not in the OPS document's list are scattered among the text of the Examples document for this checkpoint).

Bootstrapping - These guidelines require that considerable planning and assignment of resources take place prior to chartering a WG. There are several dangers in such an approach: 1) the weight of work and high commitment requirements prior to chartering could doom the chartering process to failure; 2) planning of work and assignment of resources prior to WG formation could result in poor choices, since the membership of the group has not yet even been determined, much less arrived at any common thinking; and 3) those members of the WG who were not involved in its chartering would feel no ownership of or commitment to plans made prior to their involvement.

QA materials - Checkpoint 2.1 and its conformance requirements concern Test Materials, but the Rationale talks of QA deliverables and commitment, which is a wider concern. Which is it?

Where-and-how plan - The term "where-and-how plan" is used without being explicitly defined.

QA moderator - This seems to be a pseudonym for a TM development lead. If that is what is meant, then why obfuscate by using the broader term?

Checkpoint 5.1, Define a framework for test material development - The Conformance Requirement for this checkpoint talks of a framework while the Rationale talks of a plan. We are not sure how detailed a plan is wanted here, or how this requirement differs from previous checkpoints that talked of creating a scenario for how test materials were to be developed. This part of the checkpoint description needs to be made consistent as well as differentiated from the requirements in checkpoints 2.1 and 2.2.

QAF-OPS and OPS-EXTECH documents out of synch - The CR versions of the QA Framework: Operational Guidelines <http://www.w3.org/TR/2003/CR-qaframe-ops-20030922/> and the QA Framework: Operational Examples & Techniques <http://www.w3.org/QA/WG/2003/09/qaframe-ops-extech-20030912/> documents do not agree on priorities for the 6.x checkpoints. The QAF-OPS document sets higher priorities than the OPS-EXTECH document for all checkpoints of Guideline 6 save 6.3.

Checkpoint fragmentation - Checkpoint 6.2 is essentially a detail concerning checkpoint 5.4. Clearer and more succinct QA documentation would merge these.

-- 
Professor James Hendler                      http://www.cs.umd.edu/users/hendler
Director, Semantic Web and Agent Technologies            301-405-2696
Maryland Information and Network Dynamics Lab.           301-405-6707 (Fax)
Univ of Maryland, College Park, MD 20742                 240-277-3388 (Cell)