W3C home > Mailing lists > Public > www-dom-ts@w3.org > November 2001

Re: DOM WG F2F Demo and reactions

From: Mary Brady <mbrady@nist.gov>
Date: Mon, 19 Nov 2001 11:25:43 -0500
Message-ID: <005401c17116$d6426820$293b0681@HAPPY>
To: "Dimitris Dimitriadis" <dimitris@ontologicon.com>, <www-dom-ts@w3.org>

----- Original Message -----
From: "Dimitris Dimitriadis" <dimitris@ontologicon.com>
To: <www-dom-ts@w3.org>
Sent: Friday, November 16, 2001 3:22 AM
Subject: DOM WG F2F Demo and reactions

> As far as future ideas are concerned, here is a list of things that were
> discussed:
> 1. Provide a simple, runnable, pre-release distribution in order to have
> people start testing even now. My reaction was that we on the one
> hand want to ensure the integrity of the test suite, but on the other
> hand should allow people to run tests, since the source and
> all tools needed are publicly available in any case. I want to do this
> fairly soon, so we should now try to evaluate the tests we have and
> resolve any outstanding issues. All implementors on the list, please
> check the available code and send comments on the correctness of the
> tests to this list as soon as possible, using [Test Review -
> testname.xml] _your reactions_ as the subject.
I wouldn't have any problem with releasing the suite as is to the WG.  It
would be helpful to the test suite process to have the implementors run
the tests and provide feedback to us.  Using this feedback, we can generate
discussion on whether there is a problem with the spec, the test, etc.

It might be helpful if we could define a way to report the results in XML.
Then we could write a transform that showed how the implementations
did on each test -- this would be useful in resolving issues.  I came across
the following article on the JUnit web site.  It outlines an approach
similar to ours, except that it captures the results in an XML file, and
then uses a transformation to display a nice test report.
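To make the idea concrete, a per-implementation results file might look
something like this -- all element and attribute names below are purely
illustrative, not a proposal for the actual format:

```xml
<!-- Hypothetical results format; names are illustrative only -->
<test-results implementation="SomeDOMImpl" suite-version="2001-11-19">
  <test name="documentgetdoctype" status="pass"/>
  <test name="nodeappendchild" status="fail">
    <message>expected value did not match actual value</message>
  </test>
  <test name="attrspecifiedvalue" status="skipped"
        reason="feature not supported"/>
</test-results>
```

A simple XSLT stylesheet could then merge several such files into a table
of tests versus implementations.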


Any thoughts?

> 2. Provide a simple transform to read a spec and do a smoke test;
> ripping out tests on each interface with all its methods and attributes,
> say. This would greatly enhance coverage, on the one hand, but would
> also serve as a good starting point for tests that could be further
> enhanced. It could also serve as the basic functionality tests on each
> module that the WG wants to see for level 3.

Yes, this would be a logical next step -- to be able to automatically
generate a set of tests from the spec.  I've been thinking a bit about this
one -- I'll look into it and see what I can come up with.
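As a rough sketch of what such a transform could do -- assuming the spec
source marks up interfaces and their members with elements along the lines
of <interface> and <method> (these element names are an assumption about
the spec source, not confirmed):

```xml
<!-- Hypothetical smoke-test generator; spec element names are assumed -->
<xsl:stylesheet version="1.0"
                xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
  <xsl:template match="interface">
    <xsl:for-each select="method">
      <!-- emit one skeleton test per method of the enclosing interface -->
      <test name="{concat(../@name, @name)}">
        <!-- body to be filled in and enhanced by hand -->
      </test>
    </xsl:for-each>
  </xsl:template>
</xsl:stylesheet>
```

Even if the generated bodies were empty skeletons, this would give us the
coverage baseline and a starting point for hand-enhanced tests.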

> 3. Documentation was asked for; since it is my action item from a long
> time back, I'll see to providing, if not full documentation, then at
> least a draft in the CVS for completion by all parties that have been
> involved on their particular lines of work.


> 4. Provide dates or version numbers on the tests so that it's easier to
> extract information from running the different versions of the tests
> without having to refer to the version number of the suite as such. Can
> we have another round of packaging and versioning issues on the list?

Anyone know what CVS does by default?  Do we have to put in a version
number or is one automatically generated?
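For what it's worth, CVS does assign each file its own revision number
automatically (1.1, 1.2, ...), and it expands RCS keywords on checkout.
So embedding something like the following comment in each test file would
give a per-test version stamp without extra tooling (the placement is just
an illustration):

```xml
<!-- CVS expands these keywords on checkout,
     e.g. "$Revision: 1.4 $" and "$Date: 2001/11/19 16:25:43 $" -->
<!--
  $Revision$
  $Date$
-->
```

Note these are per-file revisions, so we would still need a separate
convention if we want a version number for the suite as a whole.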
