RE: TEST: formalizing f2f decisions

> Jeremy - So, the overall
> purpose of these test cases
> are to test for conformance?

No, not in my book.



One view, which I currently believe you hold, is that the purpose of the OWL
test cases is to create a conformance suite for OWL.

On this view, we are constrained to a waterfall model of development in
which the test cases are developed after the specs.

A different view is that the test cases are part of the development process,
and that the development process is essentially iterative in nature. Under
this view the *final* agreed tests express necessary conditions for
conformance, but no claim of coverage is made. The tests created are simply
those that were useful to the group during the development process.

<aside><title>iterative testing in software development</title>
As an example of what that might mean in a software development process, I
will tell a story.

I found some defects in the DAML module of HP's Jena semantic web toolkit.
I mentioned these to the module author and said that I would fix them. He
replied that he would like to see JUnit tests in our regression test suite
that the old code failed and the fixed code passed.

Thus, as the ***first*** step of my fixes I created the tests and added them
to the regression suite. Naturally, the old system failed them. I added the
fixes, and the system now passes.

The tests are not very interesting, and they reflect an imbalance (e.g. we
include a test that the method createDAMLObjectProperty actually creates a
daml:ObjectProperty triple, whereas we do not include a test that the method
createDAMLClass actually creates a daml:Class triple).
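
For concreteness, here is a minimal JUnit sketch of the shape of such a
test. The DAMLModel interface and createDAMLObjectProperty method below are
stand-ins for the actual Jena DAML API, which I am not reproducing here, and
the package names are modern Apache Jena and JUnit rather than the code
under discussion.

import static org.junit.Assert.assertTrue;

import org.apache.jena.rdf.model.Model;
import org.apache.jena.rdf.model.Resource;
import org.apache.jena.vocabulary.RDF;
import org.junit.Test;

// Sketch only: DAMLModel and createDAMLObjectProperty are hypothetical
// stand-ins for the Jena DAML module under test.
public abstract class AbstractDAMLObjectPropertyTest {

    private static final String DAML_NS =
            "http://www.daml.org/2001/03/daml+oil#";

    /** Hypothetical view of the module under test. */
    public interface DAMLModel extends Model {
        Resource createDAMLObjectProperty(String uri);
    }

    /** A concrete subclass wires this to the real implementation. */
    protected abstract DAMLModel makeModel();

    @Test
    public void createDAMLObjectPropertyAssertsType() {
        DAMLModel m = makeModel();
        Resource p = m.createDAMLObjectProperty("http://example.org/ns#worksFor");

        // The point of the test: the factory method must have added the triple
        //   <http://example.org/ns#worksFor> rdf:type daml:ObjectProperty .
        Resource damlObjectProperty = m.createResource(DAML_NS + "ObjectProperty");
        assertTrue(m.contains(p, RDF.type, damlObjectProperty));
    }
}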

However, when we came to the code review phase, I started by talking the
module owner through the new tests. He was satisfied that they were correct,
and it was then easy to justify the code changes.

Also, the test code was perhaps more complicated than the fixes themselves.
</aside>

Testing in Iterative Spec Development
=====================================

We already have a spec, DAML+OIL, and OWL is a revision of it. (For me, the
most important outcome of the f2f was a reinforcement of that aspect of our
charter.)

At the f2f we agreed on (or appeared to agree on) some aspects of what we
wanted in the OWL revision.

I am advocating that we capture those agreements as test cases which the
current DAML+OIL language fails but which OWL is intended to pass. I am also
advocating that, at this stage, the WG approve these test cases as an
expression of that intent.
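
As a sketch only, here is one way such an approved test might be recorded,
using Jena. The otest vocabulary, the URIs and the PositiveEntailmentTest
class are made-up placeholders, not a format the WG has agreed; the intent
recorded is that the conclusions follow from the premises in OWL although
they do not in DAML+OIL.

import org.apache.jena.rdf.model.Model;
import org.apache.jena.rdf.model.ModelFactory;
import org.apache.jena.rdf.model.Property;
import org.apache.jena.vocabulary.RDF;

public class ApprovedTestSketch {

    // Placeholder vocabulary and test location only.
    static final String OTEST = "http://example.org/owl-test-vocab#";
    static final String BASE  = "http://example.org/tests/001/";

    public static void main(String[] args) {
        Model m = ModelFactory.createDefaultModel();
        m.setNsPrefix("otest", OTEST);

        Property premise    = m.createProperty(OTEST, "premiseDocument");
        Property conclusion = m.createProperty(OTEST, "conclusionDocument");
        Property status     = m.createProperty(OTEST, "status");

        // One approved test, pointing at its premise and conclusion documents.
        m.createResource(BASE + "Manifest")
         .addProperty(RDF.type, m.createResource(OTEST + "PositiveEntailmentTest"))
         .addProperty(premise, m.createResource(BASE + "premises.rdf"))
         .addProperty(conclusion, m.createResource(BASE + "conclusions.rdf"))
         .addProperty(status, "APPROVED");

        m.write(System.out, "RDF/XML-ABBREV");
    }
}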

As the LANG and SEM groups proceed, the TEST group will need to keep an eye
on such tests. If it becomes clear that OWL will not pass some specific
approved test, then that is not a disaster, but it will need to be brought
back to the WG's attention.

It is also clear that the TEST group will need to maintain such tests. For
example, at a trivial level, when I create the first tests today I will use
the daml+oil namespace, because we have not yet agreed an OWL namespace. Once
we have an OWL namespace, changing the namespace in pre-approved tests is a
maintenance job that is reported to the WG but does not require its approval.
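
A sketch of that maintenance job, again using Jena: copy each approved test
document, mapping URIs from the daml+oil namespace into the new one. OLD_NS
below is the real daml+oil namespace; NEW_NS is a placeholder, since we have
no agreed OWL namespace yet.

import org.apache.jena.rdf.model.Model;
import org.apache.jena.rdf.model.ModelFactory;
import org.apache.jena.rdf.model.Property;
import org.apache.jena.rdf.model.RDFNode;
import org.apache.jena.rdf.model.Resource;
import org.apache.jena.rdf.model.Statement;
import org.apache.jena.rdf.model.StmtIterator;

public class NamespaceRewriter {

    static final String OLD_NS = "http://www.daml.org/2001/03/daml+oil#";
    static final String NEW_NS = "http://example.org/owl-placeholder#"; // placeholder

    /** Copy a test document, mapping every URI in OLD_NS into NEW_NS. */
    public static Model rewrite(Model in) {
        Model out = ModelFactory.createDefaultModel();
        StmtIterator it = in.listStatements();
        while (it.hasNext()) {
            Statement s = it.nextStatement();
            Resource subj = (Resource) map(s.getSubject(), out);
            Property pred = out.createProperty(mapUri(s.getPredicate().getURI()));
            RDFNode obj = map(s.getObject(), out);
            out.add(subj, pred, obj);
        }
        return out;
    }

    static String mapUri(String uri) {
        return uri.startsWith(OLD_NS) ? NEW_NS + uri.substring(OLD_NS.length()) : uri;
    }

    static RDFNode map(RDFNode n, Model out) {
        if (n.isURIResource()) {
            return out.createResource(mapUri(n.asResource().getURI()));
        }
        return n; // blank nodes and literals are left unchanged
    }
}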

In this way, the tests allow us to formally capture the WG's intent, and to
keep the LANG and SEM groups honest.

As a side effect we generate a partial conformance suite.



Jeremy

Received on Wednesday, 17 April 2002 04:56:49 UTC