Re: Conformance Disclaimer -- comments please

Replying to both Lynne and David at once...

At 10:38 AM 1/29/03 -0500, Lynne Rosenthal wrote:

>I think we should remove #2. It only adds to the confusion. 
>Additionally, we recently discussed the conformance disclaimer in SpecGL 
>and removed the paragraph of the disclaimer that talked about not 
>satisfying a CP. So, to be consistent, we should remove this requirement 
>(i.e., delete #2).

I think this is my favorite.

Does anyone object to removing it altogether?  (First, see "is it worth 
saying" comment below.)

About David's comments...

At 10:13 AM 1/29/03 -0500, David Marston/Cambridge/IBM wrote:

> >From the Checkpoint:
> >1. passing all of the tests does not guarantee full conformance of an
> >implementation to the specification
> >2. failing the test suite means failing tests for the specific
> >features they target.
>
>To which I would add, if it's not too late:
>3. passing a given percentage n% of the tests does not mean that the
>implementation is n% conformant.
>Or, more politely:
>3. There is no policy that all test cases have equal weight.

I would suggest not trying to sort this out now, given the timing.  It's a 
little too new an issue.  Could we deal with it as a new issue after 
Last Call review starts?


> >a.) What does #2 mean?
>
>My Guess: Failure of some test cases should not be extrapolated to
>imply shortcomings in untested areas.

Okay, that is a clear statement.  But ... is it worth saying here?  Does it 
add anything to our principal goal, which is to avoid confusion between 
pass-all-tests and conform-to-spec?  Or, as Lynne said, does it confuse the 
matter?


> >b.)  Are we trying to say (disclaim) something like, "If you fail some
> >tests and therefore fail the test suite, don't try to draw any
> >conclusions beyond the scope of the specific features targeted by the
> >test suite."?  And is that true?!
>
>I think so. And also, don't use percentages. I'd say you can't draw
>broader conclusions based solely on test results, but having the
>product in your lab means you have other information about it.
>
> >c.)  Isn't it true that failing one specific-feature test for a MUST
> >requirement of the specification means that the implementation does
> >not conform to the specification?
>
>Yes.
>
> >Maybe that does not sound like "disclaimer",
> >but if it is true, why aren't we saying that?
>
>I think the motivation is more about competitors reporting results on
>each other in a hostile situation. The WG would like to push vendors
>to improve their conformance rather than use the test suite as a
>weapon. Also, the fact that most implementable (non-foundation) specs
>get errata over time means that the spec and the test materials are
>both fallible, so a disclaimer is appropriate.

Bottom line, again.  Any objections to simply eliminating #2 for now, 
without replacing it with anything?

-Lofton.
