- From: Sandro Hawke <sandro@w3.org>
- Date: Fri, 02 Jan 2004 11:28:22 -0500
- To: Jim Hendler <hendler@cs.umd.edu>
- Cc: www-qa@w3.org, Guus Schreiber <schreiber@cs.vu.nl>, Dan Connolly <connolly@w3.org>, Jeremy Carroll <jjc@hplb.hpl.hp.com>
Just to clarify, this is *not* what I'm going to be talking about on
Monday. The timing is an odd coincidence. My focus will be on how
WebOnt (and, to a lesser extent, RDF Core) used public test results to
monitor and guide implementations and to get through CR, as well as on
the software/systems architecture we used and on test-driven development.
-- sandro
> In response to the Call for Implementations with respect to QA
> documents, and particularly in response to your request to our
> Working Group for a Case Study [1], the Web Ontology Working Group
> has produced the following case study. The Working Group has
> reviewed this case study and approved sending it to you [2]. Also,
> largely based on the results of this case study, the WG has approved
> some consensus comments on your documents; these will be sent in a
> separate email.
> Jim Hendler, WOWG Co-Chair, for the Working Group
>
> [1] http://lists.w3.org/Archives/Public/www-webont-wg/2003Sep/0076.html
> [2] http://lists.w3.org/Archives/Public/www-webont-wg/2003Dec/0095.html
>
> *****
>
> QAF-OPS Case Study for OWL
>
> The following is a case study documenting the quality assurance
> activities undertaken by the Web Ontology working group during
> development of the OWL language. It is structured as a conformance
...